Mirror of https://github.com/DISARMFoundation/DISARMframeworks.git
Synced 2025-02-02 01:14:55 -05:00
Fixed some formatting issues with version 1.5
This commit is contained in:
parent 2c4757b429
commit d83d55c722
@@ -20,82 +20,6 @@
| Technique | Description given for this incident |
| --------- | ----------------------------------- |
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000268 <i>“[Meta has] removed one Page, five Facebook accounts, one Group and three Instagram accounts for foreign or government interference which is coordinated inauthentic behavior on behalf of a foreign or government entity. This small network originated in Russia and focused primarily on Turkey and Europe, and also on the United States.<br><br> “This operation relied on fake accounts — some of which had been already detected and removed by our automated systems — to manage their Page and their Group, and to drive people to their site purporting to be an independent think-tank based primarily in Turkey. These accounts posed as locals based in Turkey, Canada and the US. They also recruited people to write for their website. This network had almost no following on our platforms when we removed it.”</i><br><br> Meta identified that a network of accounts originating in Russia was driving people off platform to a site which presented itself as a think-tank (T0097.204: Think Tank Persona). Meta did not make an attribution about the authenticity of this off-site think tank, so neither T0143.001: Authentic Persona nor T0143.002: Fabricated Persona is used here.<br><br> Meta had access to technical data for accounts on its platform, and asserted that they were fabricated individuals posing as locals who recruited targets to write content for their website (T0097.101: Local Persona, T0097.106: Recruiter Persona, T0143.002: Fabricated Persona). |
|
||||
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000268 <i>“[Meta has] removed one Page, five Facebook accounts, one Group and three Instagram accounts for foreign or government interference which is coordinated inauthentic behavior on behalf of a foreign or government entity. This small network originated in Russia and focused primarily on Turkey and Europe, and also on the United States.<br><br> “This operation relied on fake accounts — some of which had been already detected and removed by our automated systems — to manage their Page and their Group, and to drive people to their site purporting to be an independent think-tank based primarily in Turkey. These accounts posed as locals based in Turkey, Canada and the US. They also recruited people to write for their website. This network had almost no following on our platforms when we removed it.”</i><br><br> Meta identified that a network of accounts originating in Russia were driving people off platform to a site which presented itself as a think-tank (T0097.204: Think Tank Persona). Meta did not make an attribution about the authenticity of this off-site think tank, so neither T0143.001: Authentic Persona or T0143.002: Fabricated Persona are used here.<br><br> Meta had access to technical data for accounts on its platform, and asserted that they were fabricated individuals posing as locals who recruited targets to write content for their website (T0097.101: Local Persona, T0097.106: Recruiter Persona, T0143.002: Fabricated Persona). |
|
||||
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000268 <i>“[Meta has] removed one Page, five Facebook accounts, one Group and three Instagram accounts for foreign or government interference which is coordinated inauthentic behavior on behalf of a foreign or government entity. This small network originated in Russia and focused primarily on Turkey and Europe, and also on the United States.<br><br> “This operation relied on fake accounts — some of which had been already detected and removed by our automated systems — to manage their Page and their Group, and to drive people to their site purporting to be an independent think-tank based primarily in Turkey. These accounts posed as locals based in Turkey, Canada and the US. They also recruited people to write for their website. This network had almost no following on our platforms when we removed it.”</i><br><br> Meta identified that a network of accounts originating in Russia were driving people off platform to a site which presented itself as a think-tank (T0097.204: Think Tank Persona). Meta did not make an attribution about the authenticity of this off-site think tank, so neither T0143.001: Authentic Persona or T0143.002: Fabricated Persona are used here.<br><br> Meta had access to technical data for accounts on its platform, and asserted that they were fabricated individuals posing as locals who recruited targets to write content for their website (T0097.101: Local Persona, T0097.106: Recruiter Persona, T0143.002: Fabricated Persona). |
|
||||
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000268 <i>“[Meta has] removed one Page, five Facebook accounts, one Group and three Instagram accounts for foreign or government interference which is coordinated inauthentic behavior on behalf of a foreign or government entity. This small network originated in Russia and focused primarily on Turkey and Europe, and also on the United States.<br><br> “This operation relied on fake accounts — some of which had been already detected and removed by our automated systems — to manage their Page and their Group, and to drive people to their site purporting to be an independent think-tank based primarily in Turkey. These accounts posed as locals based in Turkey, Canada and the US. They also recruited people to write for their website. This network had almost no following on our platforms when we removed it.”</i><br><br> Meta identified that a network of accounts originating in Russia were driving people off platform to a site which presented itself as a think-tank (T0097.204: Think Tank Persona). Meta did not make an attribution about the authenticity of this off-site think tank, so neither T0143.001: Authentic Persona or T0143.002: Fabricated Persona are used here.<br><br> Meta had access to technical data for accounts on its platform, and asserted that they were fabricated individuals posing as locals who recruited targets to write content for their website (T0097.101: Local Persona, T0097.106: Recruiter Persona, T0143.002: Fabricated Persona). |
|
||||
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000268 <i>“[Meta has] removed one Page, five Facebook accounts, one Group and three Instagram accounts for foreign or government interference which is coordinated inauthentic behavior on behalf of a foreign or government entity. This small network originated in Russia and focused primarily on Turkey and Europe, and also on the United States.<br><br> “This operation relied on fake accounts — some of which had been already detected and removed by our automated systems — to manage their Page and their Group, and to drive people to their site purporting to be an independent think-tank based primarily in Turkey. These accounts posed as locals based in Turkey, Canada and the US. They also recruited people to write for their website. This network had almost no following on our platforms when we removed it.”</i><br><br> Meta identified that a network of accounts originating in Russia were driving people off platform to a site which presented itself as a think-tank (T0097.204: Think Tank Persona). Meta did not make an attribution about the authenticity of this off-site think tank, so neither T0143.001: Authentic Persona or T0143.002: Fabricated Persona are used here.<br><br> Meta had access to technical data for accounts on its platform, and asserted that they were fabricated individuals posing as locals who recruited targets to write content for their website (T0097.101: Local Persona, T0097.106: Recruiter Persona, T0143.002: Fabricated Persona). |
|
||||
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000268 <i>“[Meta has] removed one Page, five Facebook accounts, one Group and three Instagram accounts for foreign or government interference which is coordinated inauthentic behavior on behalf of a foreign or government entity. This small network originated in Russia and focused primarily on Turkey and Europe, and also on the United States.<br><br> “This operation relied on fake accounts — some of which had been already detected and removed by our automated systems — to manage their Page and their Group, and to drive people to their site purporting to be an independent think-tank based primarily in Turkey. These accounts posed as locals based in Turkey, Canada and the US. They also recruited people to write for their website. This network had almost no following on our platforms when we removed it.”</i><br><br> Meta identified that a network of accounts originating in Russia were driving people off platform to a site which presented itself as a think-tank (T0097.204: Think Tank Persona). Meta did not make an attribution about the authenticity of this off-site think tank, so neither T0143.001: Authentic Persona or T0143.002: Fabricated Persona are used here.<br><br> Meta had access to technical data for accounts on its platform, and asserted that they were fabricated individuals posing as locals who recruited targets to write content for their website (T0097.101: Local Persona, T0097.106: Recruiter Persona, T0143.002: Fabricated Persona). |
|
||||
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000268 <i>“[Meta has] removed one Page, five Facebook accounts, one Group and three Instagram accounts for foreign or government interference which is coordinated inauthentic behavior on behalf of a foreign or government entity. This small network originated in Russia and focused primarily on Turkey and Europe, and also on the United States.<br><br> “This operation relied on fake accounts — some of which had been already detected and removed by our automated systems — to manage their Page and their Group, and to drive people to their site purporting to be an independent think-tank based primarily in Turkey. These accounts posed as locals based in Turkey, Canada and the US. They also recruited people to write for their website. This network had almost no following on our platforms when we removed it.”</i><br><br> Meta identified that a network of accounts originating in Russia were driving people off platform to a site which presented itself as a think-tank (T0097.204: Think Tank Persona). Meta did not make an attribution about the authenticity of this off-site think tank, so neither T0143.001: Authentic Persona or T0143.002: Fabricated Persona are used here.<br><br> Meta had access to technical data for accounts on its platform, and asserted that they were fabricated individuals posing as locals who recruited targets to write content for their website (T0097.101: Local Persona, T0097.106: Recruiter Persona, T0143.002: Fabricated Persona). |
|
||||
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000268 <i>“[Meta has] removed one Page, five Facebook accounts, one Group and three Instagram accounts for foreign or government interference which is coordinated inauthentic behavior on behalf of a foreign or government entity. This small network originated in Russia and focused primarily on Turkey and Europe, and also on the United States.<br><br> “This operation relied on fake accounts — some of which had been already detected and removed by our automated systems — to manage their Page and their Group, and to drive people to their site purporting to be an independent think-tank based primarily in Turkey. These accounts posed as locals based in Turkey, Canada and the US. They also recruited people to write for their website. This network had almost no following on our platforms when we removed it.”</i><br><br> Meta identified that a network of accounts originating in Russia were driving people off platform to a site which presented itself as a think-tank (T0097.204: Think Tank Persona). Meta did not make an attribution about the authenticity of this off-site think tank, so neither T0143.001: Authentic Persona or T0143.002: Fabricated Persona are used here.<br><br> Meta had access to technical data for accounts on its platform, and asserted that they were fabricated individuals posing as locals who recruited targets to write content for their website (T0097.101: Local Persona, T0097.106: Recruiter Persona, T0143.002: Fabricated Persona). |
|
||||
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000268 <i>“[Meta has] removed one Page, five Facebook accounts, one Group and three Instagram accounts for foreign or government interference which is coordinated inauthentic behavior on behalf of a foreign or government entity. This small network originated in Russia and focused primarily on Turkey and Europe, and also on the United States.<br><br> “This operation relied on fake accounts — some of which had been already detected and removed by our automated systems — to manage their Page and their Group, and to drive people to their site purporting to be an independent think-tank based primarily in Turkey. These accounts posed as locals based in Turkey, Canada and the US. They also recruited people to write for their website. This network had almost no following on our platforms when we removed it.”</i><br><br> Meta identified that a network of accounts originating in Russia were driving people off platform to a site which presented itself as a think-tank (T0097.204: Think Tank Persona). Meta did not make an attribution about the authenticity of this off-site think tank, so neither T0143.001: Authentic Persona or T0143.002: Fabricated Persona are used here.<br><br> Meta had access to technical data for accounts on its platform, and asserted that they were fabricated individuals posing as locals who recruited targets to write content for their website (T0097.101: Local Persona, T0097.106: Recruiter Persona, T0143.002: Fabricated Persona). |
|
||||
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000268 <i>“[Meta has] removed one Page, five Facebook accounts, one Group and three Instagram accounts for foreign or government interference which is coordinated inauthentic behavior on behalf of a foreign or government entity. This small network originated in Russia and focused primarily on Turkey and Europe, and also on the United States.<br><br> “This operation relied on fake accounts — some of which had been already detected and removed by our automated systems — to manage their Page and their Group, and to drive people to their site purporting to be an independent think-tank based primarily in Turkey. These accounts posed as locals based in Turkey, Canada and the US. They also recruited people to write for their website. This network had almost no following on our platforms when we removed it.”</i><br><br> Meta identified that a network of accounts originating in Russia were driving people off platform to a site which presented itself as a think-tank (T0097.204: Think Tank Persona). Meta did not make an attribution about the authenticity of this off-site think tank, so neither T0143.001: Authentic Persona or T0143.002: Fabricated Persona are used here.<br><br> Meta had access to technical data for accounts on its platform, and asserted that they were fabricated individuals posing as locals who recruited targets to write content for their website (T0097.101: Local Persona, T0097.106: Recruiter Persona, T0143.002: Fabricated Persona). |
|
||||
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000268 <i>“[Meta has] removed one Page, five Facebook accounts, one Group and three Instagram accounts for foreign or government interference which is coordinated inauthentic behavior on behalf of a foreign or government entity. This small network originated in Russia and focused primarily on Turkey and Europe, and also on the United States.<br><br> “This operation relied on fake accounts — some of which had been already detected and removed by our automated systems — to manage their Page and their Group, and to drive people to their site purporting to be an independent think-tank based primarily in Turkey. These accounts posed as locals based in Turkey, Canada and the US. They also recruited people to write for their website. This network had almost no following on our platforms when we removed it.”</i><br><br> Meta identified that a network of accounts originating in Russia were driving people off platform to a site which presented itself as a think-tank (T0097.204: Think Tank Persona). Meta did not make an attribution about the authenticity of this off-site think tank, so neither T0143.001: Authentic Persona or T0143.002: Fabricated Persona are used here.<br><br> Meta had access to technical data for accounts on its platform, and asserted that they were fabricated individuals posing as locals who recruited targets to write content for their website (T0097.101: Local Persona, T0097.106: Recruiter Persona, T0143.002: Fabricated Persona). |
|
||||
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000268 <i>“[Meta has] removed one Page, five Facebook accounts, one Group and three Instagram accounts for foreign or government interference which is coordinated inauthentic behavior on behalf of a foreign or government entity. This small network originated in Russia and focused primarily on Turkey and Europe, and also on the United States.<br><br> “This operation relied on fake accounts — some of which had been already detected and removed by our automated systems — to manage their Page and their Group, and to drive people to their site purporting to be an independent think-tank based primarily in Turkey. These accounts posed as locals based in Turkey, Canada and the US. They also recruited people to write for their website. This network had almost no following on our platforms when we removed it.”</i><br><br> Meta identified that a network of accounts originating in Russia were driving people off platform to a site which presented itself as a think-tank (T0097.204: Think Tank Persona). Meta did not make an attribution about the authenticity of this off-site think tank, so neither T0143.001: Authentic Persona or T0143.002: Fabricated Persona are used here.<br><br> Meta had access to technical data for accounts on its platform, and asserted that they were fabricated individuals posing as locals who recruited targets to write content for their website (T0097.101: Local Persona, T0097.106: Recruiter Persona, T0143.002: Fabricated Persona). |
|
||||
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000268 <i>“[Meta has] removed one Page, five Facebook accounts, one Group and three Instagram accounts for foreign or government interference which is coordinated inauthentic behavior on behalf of a foreign or government entity. This small network originated in Russia and focused primarily on Turkey and Europe, and also on the United States.<br><br> “This operation relied on fake accounts — some of which had been already detected and removed by our automated systems — to manage their Page and their Group, and to drive people to their site purporting to be an independent think-tank based primarily in Turkey. These accounts posed as locals based in Turkey, Canada and the US. They also recruited people to write for their website. This network had almost no following on our platforms when we removed it.”</i><br><br> Meta identified that a network of accounts originating in Russia were driving people off platform to a site which presented itself as a think-tank (T0097.204: Think Tank Persona). Meta did not make an attribution about the authenticity of this off-site think tank, so neither T0143.001: Authentic Persona or T0143.002: Fabricated Persona are used here.<br><br> Meta had access to technical data for accounts on its platform, and asserted that they were fabricated individuals posing as locals who recruited targets to write content for their website (T0097.101: Local Persona, T0097.106: Recruiter Persona, T0143.002: Fabricated Persona). |
|
||||
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000268 <i>“[Meta has] removed one Page, five Facebook accounts, one Group and three Instagram accounts for foreign or government interference which is coordinated inauthentic behavior on behalf of a foreign or government entity. This small network originated in Russia and focused primarily on Turkey and Europe, and also on the United States.<br><br> “This operation relied on fake accounts — some of which had been already detected and removed by our automated systems — to manage their Page and their Group, and to drive people to their site purporting to be an independent think-tank based primarily in Turkey. These accounts posed as locals based in Turkey, Canada and the US. They also recruited people to write for their website. This network had almost no following on our platforms when we removed it.”</i><br><br> Meta identified that a network of accounts originating in Russia were driving people off platform to a site which presented itself as a think-tank (T0097.204: Think Tank Persona). Meta did not make an attribution about the authenticity of this off-site think tank, so neither T0143.001: Authentic Persona or T0143.002: Fabricated Persona are used here.<br><br> Meta had access to technical data for accounts on its platform, and asserted that they were fabricated individuals posing as locals who recruited targets to write content for their website (T0097.101: Local Persona, T0097.106: Recruiter Persona, T0143.002: Fabricated Persona). |
|
||||
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000268 <i>“[Meta has] removed one Page, five Facebook accounts, one Group and three Instagram accounts for foreign or government interference which is coordinated inauthentic behavior on behalf of a foreign or government entity. This small network originated in Russia and focused primarily on Turkey and Europe, and also on the United States.<br><br> “This operation relied on fake accounts — some of which had been already detected and removed by our automated systems — to manage their Page and their Group, and to drive people to their site purporting to be an independent think-tank based primarily in Turkey. These accounts posed as locals based in Turkey, Canada and the US. They also recruited people to write for their website. This network had almost no following on our platforms when we removed it.”</i><br><br> Meta identified that a network of accounts originating in Russia were driving people off platform to a site which presented itself as a think-tank (T0097.204: Think Tank Persona). Meta did not make an attribution about the authenticity of this off-site think tank, so neither T0143.001: Authentic Persona or T0143.002: Fabricated Persona are used here.<br><br> Meta had access to technical data for accounts on its platform, and asserted that they were fabricated individuals posing as locals who recruited targets to write content for their website (T0097.101: Local Persona, T0097.106: Recruiter Persona, T0143.002: Fabricated Persona). |
|
||||
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000268 <i>“[Meta has] removed one Page, five Facebook accounts, one Group and three Instagram accounts for foreign or government interference which is coordinated inauthentic behavior on behalf of a foreign or government entity. This small network originated in Russia and focused primarily on Turkey and Europe, and also on the United States.<br><br> “This operation relied on fake accounts — some of which had been already detected and removed by our automated systems — to manage their Page and their Group, and to drive people to their site purporting to be an independent think-tank based primarily in Turkey. These accounts posed as locals based in Turkey, Canada and the US. They also recruited people to write for their website. This network had almost no following on our platforms when we removed it.”</i><br><br> Meta identified that a network of accounts originating in Russia were driving people off platform to a site which presented itself as a think-tank (T0097.204: Think Tank Persona). Meta did not make an attribution about the authenticity of this off-site think tank, so neither T0143.001: Authentic Persona or T0143.002: Fabricated Persona are used here.<br><br> Meta had access to technical data for accounts on its platform, and asserted that they were fabricated individuals posing as locals who recruited targets to write content for their website (T0097.101: Local Persona, T0097.106: Recruiter Persona, T0143.002: Fabricated Persona). |
|
||||
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000268 <i>“[Meta has] removed one Page, five Facebook accounts, one Group and three Instagram accounts for foreign or government interference which is coordinated inauthentic behavior on behalf of a foreign or government entity. This small network originated in Russia and focused primarily on Turkey and Europe, and also on the United States.<br><br> “This operation relied on fake accounts — some of which had been already detected and removed by our automated systems — to manage their Page and their Group, and to drive people to their site purporting to be an independent think-tank based primarily in Turkey. These accounts posed as locals based in Turkey, Canada and the US. They also recruited people to write for their website. This network had almost no following on our platforms when we removed it.”</i><br><br> Meta identified that a network of accounts originating in Russia were driving people off platform to a site which presented itself as a think-tank (T0097.204: Think Tank Persona). Meta did not make an attribution about the authenticity of this off-site think tank, so neither T0143.001: Authentic Persona or T0143.002: Fabricated Persona are used here.<br><br> Meta had access to technical data for accounts on its platform, and asserted that they were fabricated individuals posing as locals who recruited targets to write content for their website (T0097.101: Local Persona, T0097.106: Recruiter Persona, T0143.002: Fabricated Persona). |
|
||||
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000268 <i>“[Meta has] removed one Page, five Facebook accounts, one Group and three Instagram accounts for foreign or government interference which is coordinated inauthentic behavior on behalf of a foreign or government entity. This small network originated in Russia and focused primarily on Turkey and Europe, and also on the United States.<br><br> “This operation relied on fake accounts — some of which had been already detected and removed by our automated systems — to manage their Page and their Group, and to drive people to their site purporting to be an independent think-tank based primarily in Turkey. These accounts posed as locals based in Turkey, Canada and the US. They also recruited people to write for their website. This network had almost no following on our platforms when we removed it.”</i><br><br> Meta identified that a network of accounts originating in Russia were driving people off platform to a site which presented itself as a think-tank (T0097.204: Think Tank Persona). Meta did not make an attribution about the authenticity of this off-site think tank, so neither T0143.001: Authentic Persona or T0143.002: Fabricated Persona are used here.<br><br> Meta had access to technical data for accounts on its platform, and asserted that they were fabricated individuals posing as locals who recruited targets to write content for their website (T0097.101: Local Persona, T0097.106: Recruiter Persona, T0143.002: Fabricated Persona). |
|
||||
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000268 <i>“[Meta has] removed one Page, five Facebook accounts, one Group and three Instagram accounts for foreign or government interference which is coordinated inauthentic behavior on behalf of a foreign or government entity. This small network originated in Russia and focused primarily on Turkey and Europe, and also on the United States.<br><br> “This operation relied on fake accounts — some of which had been already detected and removed by our automated systems — to manage their Page and their Group, and to drive people to their site purporting to be an independent think-tank based primarily in Turkey. These accounts posed as locals based in Turkey, Canada and the US. They also recruited people to write for their website. This network had almost no following on our platforms when we removed it.”</i><br><br> Meta identified that a network of accounts originating in Russia were driving people off platform to a site which presented itself as a think-tank (T0097.204: Think Tank Persona). Meta did not make an attribution about the authenticity of this off-site think tank, so neither T0143.001: Authentic Persona or T0143.002: Fabricated Persona are used here.<br><br> Meta had access to technical data for accounts on its platform, and asserted that they were fabricated individuals posing as locals who recruited targets to write content for their website (T0097.101: Local Persona, T0097.106: Recruiter Persona, T0143.002: Fabricated Persona). |
|
||||
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000268 <i>“[Meta has] removed one Page, five Facebook accounts, one Group and three Instagram accounts for foreign or government interference which is coordinated inauthentic behavior on behalf of a foreign or government entity. This small network originated in Russia and focused primarily on Turkey and Europe, and also on the United States.<br><br> “This operation relied on fake accounts — some of which had been already detected and removed by our automated systems — to manage their Page and their Group, and to drive people to their site purporting to be an independent think-tank based primarily in Turkey. These accounts posed as locals based in Turkey, Canada and the US. They also recruited people to write for their website. This network had almost no following on our platforms when we removed it.”</i><br><br> Meta identified that a network of accounts originating in Russia were driving people off platform to a site which presented itself as a think-tank (T0097.204: Think Tank Persona). Meta did not make an attribution about the authenticity of this off-site think tank, so neither T0143.001: Authentic Persona or T0143.002: Fabricated Persona are used here.<br><br> Meta had access to technical data for accounts on its platform, and asserted that they were fabricated individuals posing as locals who recruited targets to write content for their website (T0097.101: Local Persona, T0097.106: Recruiter Persona, T0143.002: Fabricated Persona). |
| [T0144.002 Persona Template](../../generated_pages/techniques/T0144.002.md) | IT00000298 In this report, accounts were identified as part of “a sophisticated and extensive coordinated network orchestrating a disinformation campaign targeting Israeli digital spaces since October 7th, 2023”.<br><br> <i>“A core component of the detection methodology was applying qualitative linguistic analysis. This involved checking the fingerprint of language, syntax, and style used in the comments and profile of the suspected account. Each account bio consistently incorporated a combination of specific elements: emojis, nationality, location, educational institution or occupation, age, and a personal quote, sports team or band. The recurrence of this specific formula across multiple accounts hinted at a standardized template for bio construction.”</i><br><br> This example shows how actors can follow a templated formula to present a persona on social media platforms (T0143.002: Fabricated Persona, T0144.002: Persona Template). |
|
||||
| [T0144.002 Persona Template](../../generated_pages/techniques/T0144.002.md) | IT00000298 In this report accounts were identified as part of “a sophisticated and extensive coordinated network orchestrating a disinformation campaign targeting Israeli digital spaces since October 7th, 2023”.<br><br> <i>“A core component of the detection methodology was applying qualitative linguistic analysis. This involved checking the fingerprint of language, syntax, and style used in the comments and profile of the suspected account. Each account bio consistently incorporated a combination of specific elements: emojis, nationality, location, educational institution or occupation, age, and a personal quote, sports team or band. The recurrence of this specific formula across multiple accounts hinted at a standardized template for bio construction.”</i><br><br> This example shows how actors can follow a templated formula to present a persona on social media platforms (T0143.002: Fabricated Persona, T0144.002: Persona Template). |
|
||||
| [T0144.002 Persona Template](../../generated_pages/techniques/T0144.002.md) | IT00000298 In this report accounts were identified as part of “a sophisticated and extensive coordinated network orchestrating a disinformation campaign targeting Israeli digital spaces since October 7th, 2023”.<br><br> <i>“A core component of the detection methodology was applying qualitative linguistic analysis. This involved checking the fingerprint of language, syntax, and style used in the comments and profile of the suspected account. Each account bio consistently incorporated a combination of specific elements: emojis, nationality, location, educational institution or occupation, age, and a personal quote, sports team or band. The recurrence of this specific formula across multiple accounts hinted at a standardized template for bio construction.”</i><br><br> This example shows how actors can follow a templated formula to present a persona on social media platforms (T0143.002: Fabricated Persona, T0144.002: Persona Template). |
|
||||
| [T0144.002 Persona Template](../../generated_pages/techniques/T0144.002.md) | IT00000298 In this report accounts were identified as part of “a sophisticated and extensive coordinated network orchestrating a disinformation campaign targeting Israeli digital spaces since October 7th, 2023”.<br><br> <i>“A core component of the detection methodology was applying qualitative linguistic analysis. This involved checking the fingerprint of language, syntax, and style used in the comments and profile of the suspected account. Each account bio consistently incorporated a combination of specific elements: emojis, nationality, location, educational institution or occupation, age, and a personal quote, sports team or band. The recurrence of this specific formula across multiple accounts hinted at a standardized template for bio construction.”</i><br><br> This example shows how actors can follow a templated formula to present a persona on social media platforms (T0143.002: Fabricated Persona, T0144.002: Persona Template). |
|
||||
| [T0144.002 Persona Template](../../generated_pages/techniques/T0144.002.md) | IT00000298 In this report accounts were identified as part of “a sophisticated and extensive coordinated network orchestrating a disinformation campaign targeting Israeli digital spaces since October 7th, 2023”.<br><br> <i>“A core component of the detection methodology was applying qualitative linguistic analysis. This involved checking the fingerprint of language, syntax, and style used in the comments and profile of the suspected account. Each account bio consistently incorporated a combination of specific elements: emojis, nationality, location, educational institution or occupation, age, and a personal quote, sports team or band. The recurrence of this specific formula across multiple accounts hinted at a standardized template for bio construction.”</i><br><br> This example shows how actors can follow a templated formula to present a persona on social media platforms (T0143.002: Fabricated Persona, T0144.002: Persona Template). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
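Note (added below the edit line, not part of the generated incident record): the persona-template detection described above reduces each account bio to its structural "fingerprint" and flags recurring structures. A minimal sketch of that idea follows; the bios, regexes, and placeholder labels are invented for illustration and are not the analysts' actual methodology.

```python
import re
from collections import Counter

# Hypothetical account bios. The first three follow the template structure the
# report describes (emoji, location, occupation, age, personal quote); the
# fourth is a free-form bio for contrast.
BIOS = [
    "🇺🇸 NYC | Student at NYU | 24 | “Live laugh love”",
    "🇬🇧 London | Nurse | 27 | “Carpe diem”",
    "🇮🇱 Tel Aviv | Engineer | 31 | “No pain no gain”",
    "Just here for the memes, don't DM me",
]

# One pattern with named alternatives, applied in a single pass so the
# placeholder letters are not themselves re-matched.
PATTERN = re.compile(
    "(?P<emoji>[\U0001F1E6-\U0001F1FF\U0001F300-\U0001FAFF]+)"  # emoji runs
    "|(?P<quote>[“”\"'].*?[“”\"'])"                             # quoted mottos
    "|(?P<num>\\b\\d{2}\\b)"                                    # two-digit ages
    "|(?P<word>[A-Za-z][A-Za-z .]*)"                            # word runs
)
LABELS = {"emoji": "E", "quote": "Q", "num": "N", "word": "W"}

def bio_signature(bio: str) -> str:
    """Reduce a bio to a coarse structural fingerprint, e.g. 'EW|W|N|Q'."""
    collapsed = PATTERN.sub(lambda m: LABELS[m.lastgroup], bio)
    return re.sub(r"\s+", "", collapsed)

counts = Counter(bio_signature(b) for b in BIOS)
# A signature shared by several accounts hints at a standardized bio template.
templated = [sig for sig, n in counts.items() if n > 1]
print(templated)
```

The templated bios collapse to the same signature while the free-form bio does not, which is the recurrence signal the report's qualitative analysis relied on.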
@@ -1,8 +1,6 @@
# Incident I00063: Olympic Doping Scandal
* **Summary:** On 18 July 2016, Richard McLaren, a Canadian attorney retained by the World Anti-Doping Agency (WADA) to investigate allegations made by Grigor Rodchenkov (the former head of Russia's national anti-doping laboratory, the Anti-Doping Center, which WADA suspended in November 2015 for facilitating Russia's elaborate state-sponsored doping program. Rodchenkov helped develop and distribute banned performance-enhancing substances for thousands of Russian Olympians from 2005 to 2015. He made headlines in 2016 as a whistleblower, helping expose the complex and extensive nature of Russia's doping program. His revelations led to Russia's partial ban from the 2016 Summer Olympics and total ban from the 2018 Winter Olympics.). These allegations led to a 97-page report covering significant state-sponsored doping in Russia.
The investigation found corroborating evidence after conducting witness interviews, reviewing thousands of documents, analysis of hard drives, forensic analysis of urine sample collection bottles, and laboratory analysis of individual athlete samples, with "more evidence becoming available by the day." The report concluded that it was shown "beyond a reasonable doubt" that Russia's Ministry of Sport, the Centre of Sports Preparation of the National Teams of Russia, the Federal Security Service (FSB), and the WADA-accredited laboratory in Moscow had "operated for the protection of doped Russian athletes" within a "state-directed failsafe system" using "the disappearing positive [test] methodology" (DPM) after the country's poor medal count during the 2010 Winter Olympic Games in Vancouver. McLaren stated that urine samples were opened in Sochi in order to swap them "without any evidence to the untrained eye". The official producer of BEREG-KIT security bottles used for anti-doping tests, Berlinger Group, stated, "We have no knowledge of the specifications, the methods or the procedures involved in the tests and experiments conducted by the McLaren Commission."
* **Summary:** On 18 July 2016, Richard McLaren, a Canadian attorney retained by the World Anti-Doping Agency (WADA) to investigate allegations made by Grigor Rodchenkov (the former head of Russia's national anti-doping laboratory, the Anti-Doping Center, which WADA suspended in November 2015 for facilitating Russia's elaborate state-sponsored doping program. Rodchenkov helped develop and distribute banned performance-enhancing substances for thousands of Russian Olympians from 2005 to 2015. He made headlines in 2016 as a whistleblower, helping expose the complex and extensive nature of Russia's doping program. His revelations led to Russia's partial ban from the 2016 Summer Olympics and total ban from the 2018 Winter Olympics.). These allegations led to a 97-page report covering significant state-sponsored doping in Russia. <br><br>The investigation found corroborating evidence after conducting witness interviews, reviewing thousands of documents, analysis of hard drives, forensic analysis of urine sample collection bottles, and laboratory analysis of individual athlete samples, with "more evidence becoming available by the day." The report concluded that it was shown "beyond a reasonable doubt" that Russia's Ministry of Sport, the Centre of Sports Preparation of the National Teams of Russia, the Federal Security Service (FSB), and the WADA-accredited laboratory in Moscow had "operated for the protection of doped Russian athletes" within a "state-directed failsafe system" using "the disappearing positive [test] methodology" (DPM) after the country's poor medal count during the 2010 Winter Olympic Games in Vancouver. McLaren stated that urine samples were opened in Sochi in order to swap them "without any evidence to the untrained eye". The official producer of BEREG-KIT security bottles used for anti-doping tests, Berlinger Group, stated, "We have no knowledge of the specifications, the methods or the procedures involved in the tests and experiments conducted by the McLaren Commission."
* **incident type**: campaign
@@ -21,11 +21,7 @@
| Technique | Description given for this incident |
| --------- | ------------------------- |
| [T0104.002 Dating App](../../generated_pages/techniques/T0104.002.md) | IT00000214 _"In the days leading up to the UK’s [2017] general election, youths looking for love online encountered a whole new kind of Tinder nightmare. A group of young activists built a Tinder chatbot to co-opt profiles and persuade swing voters to support Labour. The bot accounts sent 30,000-40,000 messages to targeted 18-25 year olds in battleground constituencies like Dudley North, which Labour ended up winning by only 22 votes._ <br /> <br />
_"Tinder is a dating app where users swipe right to indicate attraction and interest in a potential partner. If both people swipe right on each other’s profile, a dialogue box becomes available for them to privately chat. After meeting their crowdfunding goal of only £500, the team built a tool which took over and operated the accounts of recruited Tinder-users. By upgrading the profiles to Tinder Premium, the team was able to place bots in any contested constituency across the UK. Once planted, the bots swiped right on all users in the attempt to get the largest number of matches and inquire into their voting intentions."_ <br /> <br />
This incident matches T0104.002: Dating App, as users of Tinder were targeted in an attempt to persuade users to vote for a particular party in the upcoming election, rather than for the purpose of connecting those who were authentically interested in dating each other. |
| [T0104.002 Dating App](../../generated_pages/techniques/T0104.002.md) | IT00000214 _"In the days leading up to the UK’s [2017] general election, youths looking for love online encountered a whole new kind of Tinder nightmare. A group of young activists built a Tinder chatbot to co-opt profiles and persuade swing voters to support Labour. The bot accounts sent 30,000-40,000 messages to targeted 18-25 year olds in battleground constituencies like Dudley North, which Labour ended up winning by only 22 votes._<br /> <br />_"Tinder is a dating app where users swipe right to indicate attraction and interest in a potential partner. If both people swipe right on each other’s profile, a dialogue box becomes available for them to privately chat. After meeting their crowdfunding goal of only £500, the team built a tool which took over and operated the accounts of recruited Tinder-users. By upgrading the profiles to Tinder Premium, the team was able to place bots in any contested constituency across the UK. Once planted, the bots swiped right on all users in the attempt to get the largest number of matches and inquire into their voting intentions."_ <br /> <br />This incident matches T0104.002: Dating App, as users of Tinder were targeted in an attempt to persuade users to vote for a particular party in the upcoming election, rather than for the purpose of connecting those who were authentically interested in dating each other. |
| [T0141.001 Acquire Compromised Account](../../generated_pages/techniques/T0141.001.md) | IT00000240 <I>“In the days leading up to the UK’s [2017] general election, youths looking for love online encountered a whole new kind of Tinder nightmare. A group of young activists built a Tinder chatbot to co-opt profiles and persuade swing voters to support Labour. The bot accounts sent 30,000-40,000 messages to targeted 18-25 year olds in battleground constituencies like Dudley North, which Labour ended up winning by only 22 votes. [...]<br><br> “The activists maintain that the project was meant to foster democratic engagement. But screenshots of the bots’ activity expose a harsher reality. Images of conversations between real users and these bots, posted on i-D, Mashable, as well as on Fowler and Goodman’s public Twitter accounts, show that the bots did not identify themselves as automated accounts, instead posing as the user whose profile they had taken over. While conducting research for this story, it turned out that a number of [the reporters’ friends] living in Oxford had interacted with the bot in the lead up to the election and had no idea that it was not a real person.”</i><br><br> In this example people offered up their real accounts for the automation of political messaging; the actors convinced the users to give up access to their accounts to use in the operation (T0141.001: Acquire Compromised Account). The actors maintained the accounts’ existing persona, and presented themselves as potential romantic suitors for legitimate platform users (T0097.109: Romantic Suitor Persona, T0143.003: Impersonated Persona). |
| [T0143.003 Impersonated Persona](../../generated_pages/techniques/T0143.003.md) | IT00000242 <I>“In the days leading up to the UK’s [2017] general election, youths looking for love online encountered a whole new kind of Tinder nightmare. A group of young activists built a Tinder chatbot to co-opt profiles and persuade swing voters to support Labour. The bot accounts sent 30,000-40,000 messages to targeted 18-25 year olds in battleground constituencies like Dudley North, which Labour ended up winning by only 22 votes. [...]<br><br> “The activists maintain that the project was meant to foster democratic engagement. But screenshots of the bots’ activity expose a harsher reality. Images of conversations between real users and these bots, posted on i-D, Mashable, as well as on Fowler and Goodman’s public Twitter accounts, show that the bots did not identify themselves as automated accounts, instead posing as the user whose profile they had taken over. While conducting research for this story, it turned out that a number of [the reporters’ friends] living in Oxford had interacted with the bot in the lead up to the election and had no idea that it was not a real person.”</i><br><br> In this example people offered up their real accounts for the automation of political messaging; the actors convinced the users to give up access to their accounts to use in the operation (T0141.001: Acquire Compromised Account). The actors maintained the accounts’ existing persona, and presented themselves as potential romantic suitors for legitimate platform users (T0097.109: Romantic Suitor Persona, T0143.003: Impersonated Persona). |
@@ -21,7 +21,6 @@
| Technique | Description given for this incident |
| --------- | ------------------------- |
| [T0143.004 Parody Persona](../../generated_pages/techniques/T0143.004.md) | IT00000217 <i>“In France, in the lead-up to the 2017 election, we saw [the] labeling content as ‘‘satire” as a deliberate tactic. In one example, written up by Adrien Sénécat in Le Monde, it shows the step-by-step approach of those who want to use satire in this way.”<br><br> “PHASE 1: Le Gorafi, a satirical site [which focuses on news/current affairs], ‘‘reported” that French presidential candidate Emmanuel Macron feels dirty after touching poor people’s hands. This worked as an attack on Macron as he is regularly characterized as being out of touch and elitist.<br><br> “PHASE 2: Hyper-partisan Facebook Pages used this ‘‘claim” and created new reports, including footage of Macron visiting a factory, and wiping his hands during the visit.<br><br> “PHASE 3: The videos went viral, and a worker in another factory challenged Macron to shake his ‘‘dirty, working class hands.” The news cycle continued.”</I><br><br> In this example a satirical news website (T0097.202: News Outlet Persona, T0143.004: Parody Persona) published a narrative claiming Macron felt dirty after touching poor people’s hands. This story was uncritically amplified without the context that its origin was a parody site, and with video content appearing to support the narrative. |
| [T0143.004 Parody Persona](../../generated_pages/techniques/T0143.004.md) | IT00000218 <i>“A 2019 case in the US involved a Republican political operative who created a parody site designed to look like Joe Biden’s official website as the former vice president was campaigning to be the Democratic nominee for the 2020 presidential election. With a URL of joebiden[.]info, the parody site was indexed by Google higher than Biden’s official site, joebiden[.]com, when he launched his campaign in April 2019. The operative, who previously had created content for Donald Trump, said he did not create the site for the Trump campaign directly.<br><br> “The opening line on the parody site reads: “Uncle Joe is back and ready to take a hands-on approach to America’s problems!” It is full of images of Biden kissing and hugging young girls and women. At the bottom of the page a statement reads: “This site is political commentary and parody of Joe Biden’s Presidential campaign website. This is not Joe Biden’s actual website. It is intended for entertainment and political commentary only.””</i><br><br> In this example a website was created which claimed to be a parody of Joe Biden’s official website (T0143.004: Parody Persona).<br><br> Although the website was a parody, it ranked higher than Joe Biden’s real website on Google search. |
@@ -15,19 +15,20 @@
| Reference | Pub Date | Authors | Org | Archive |
| --------- | -------- | ------- | --- | ------- |
| [https://cloud.google.com/blog/topics/threat-intelligence/untangling-iran-apt42-operations](https://cloud.google.com/blog/topics/threat-intelligence/untangling-iran-apt42-operations) | 2024/05/01 | Ofir Rozmann, Asli Koksal, Adrian Hernandez, Sarah Bock, Jonathan Leathery | Mendicant | [https://web.archive.org/web/20240619195456/https://cloud.google.com/blog/topics/threat-intelligence/untangling-iran-apt42-operations/](https://web.archive.org/web/20240619195456/https://cloud.google.com/blog/topics/threat-intelligence/untangling-iran-apt42-operations/) |
| [https://cloud.google.com/blog/topics/threat-intelligence/untangling-iran-apt42-operations](https://cloud.google.com/blog/topics/threat-intelligence/untangling-iran-apt42-operations) | 2024/05/01 | Ofir Rozmann, Asli Koksal, Adrian Hernandez, Sarah Bock, Jonathan Leathery | Mandiant | [https://web.archive.org/web/20240619195456/https://cloud.google.com/blog/topics/threat-intelligence/untangling-iran-apt42-operations/](https://web.archive.org/web/20240619195456/https://cloud.google.com/blog/topics/threat-intelligence/untangling-iran-apt42-operations/) |
| Technique | Description given for this incident |
| --------- | ------------------------- |
| [T0097.100 Individual Persona](../../generated_pages/techniques/T0097.100.md) | IT00000231 <I>“[Iranian state-sponsored cyber espionage actor] APT42 cloud operations attack lifecycle can be described in details as follows:<br> <br>- “Social engineering schemes involving decoys and trust building, which includes masquerading as legitimate NGOs and conducting ongoing correspondence with the target, sometimes lasting several weeks. <br>- The threat actor masqueraded as well-known international organizations in the legal and NGO fields and sent emails from domains typosquatting the original NGO domains, for example aspenlnstitute[.]org. <br>- The Aspen Institute became aware of this spoofed domain and collaborated with industry partners, including blocking it in SafeBrowsing, thus protecting users of Google Chrome and additional browsers. <br>- To increase their credibility, APT42 impersonated high-ranking personnel working at the aforementioned organizations when creating the email personas. <br>- APT42 enhanced their campaign credibility by using decoy material inviting targets to legitimate and relevant events and conferences. In one instance, the decoy material was hosted on an attacker-controlled SharePoint folder, accessible only after the victim entered their credentials. Mandiant did not identify malicious elements in the files, suggesting they were used solely to gain the victim’s trust.”</I><br><br> In this example APT42, an Iranian state-sponsored cyber espionage actor, created a domain impersonating the existing NGO The Aspen Institute (T00143.004: Impersonated Persona, T0097.207: NGO Persona). They increased the perceived legitimacy of the impersonation by also impersonating high-ranking employees of the NGO (T0097.100: Individual Persona, T00143.004: Impersonated Persona). |
| [T0097.100 Individual Persona](../../generated_pages/techniques/T0097.100.md) | IT00000231 <I>“[Iranian state-sponsored cyber espionage actor] APT42 cloud operations attack lifecycle can be described in details as follows:<br> <br>- “Social engineering schemes involving decoys and trust building, which includes masquerading as legitimate NGOs and conducting ongoing correspondence with the target, sometimes lasting several weeks. <br>- The threat actor masqueraded as well-known international organizations in the legal and NGO fields and sent emails from domains typosquatting the original NGO domains, for example aspenlnstitute[.]org. <br>- The Aspen Institute became aware of this spoofed domain and collaborated with industry partners, including blocking it in SafeBrowsing, thus protecting users of Google Chrome and additional browsers. <br>- To increase their credibility, APT42 impersonated high-ranking personnel working at the aforementioned organizations when creating the email personas. <br>- APT42 enhanced their campaign credibility by using decoy material inviting targets to legitimate and relevant events and conferences. In one instance, the decoy material was hosted on an attacker-controlled SharePoint folder, accessible only after the victim entered their credentials. Mandiant did not identify malicious elements in the files, suggesting they were used solely to gain the victim’s trust.”</I><br><br> In this example APT42, an Iranian state-sponsored cyber espionage actor, created a domain impersonating the existing NGO The Aspen Institute (T0143.003: Impersonated Persona, T0097.207: NGO Persona). They increased the perceived legitimacy of the impersonation by also impersonating high-ranking employees of the NGO (T0097.100: Individual Persona, T0143.003: Impersonated Persona). |
| [T0097.103 Activist Persona](../../generated_pages/techniques/T0097.103.md) | IT00000227 <I>“In March 2023, [Iranian state-sponsored cyber espionage actor] APT42 sent a spear-phishing email with a fake Google Meet invitation, allegedly sent on behalf of Mona Louri, a likely fake persona leveraged by APT42, claiming to be a human rights activist and researcher. Upon entry, the user was presented with a fake Google Meet page and asked to enter their credentials, which were subsequently sent to the attackers.”</i><br><br>In this example APT42, an Iranian state-sponsored cyber espionage actor, created an account which presented as a human rights activist (T0097.103: Activist Persona) and researcher (T0097.107: Researcher Persona). The analysts assert that it was likely the persona was fabricated (T0143.002: Fabricated Persona) |
| [T0097.107 Researcher Persona](../../generated_pages/techniques/T0097.107.md) | IT00000228 <I>“In March 2023, [Iranian state-sponsored cyber espionage actor] APT42 sent a spear-phishing email with a fake Google Meet invitation, allegedly sent on behalf of Mona Louri, a likely fake persona leveraged by APT42, claiming to be a human rights activist and researcher. Upon entry, the user was presented with a fake Google Meet page and asked to enter their credentials, which were subsequently sent to the attackers.”</i><br><br>In this example APT42, an Iranian state-sponsored cyber espionage actor, created an account which presented as a human rights activist (T0097.103: Activist Persona) and researcher (T0097.107: Researcher Persona). The analysts assert that it was likely the persona was fabricated (T0143.002: Fabricated Persona) |
| [T0097.202 News Outlet Persona](../../generated_pages/techniques/T0097.202.md) | IT00000223 <i>“Mandiant identified at least three clusters of infrastructure used by [Iranian state-sponsored cyber espionage actor] APT42 to harvest credentials from targets in the policy and government sectors, media organizations and journalists, and NGOs and activists. The three clusters employ similar tactics, techniques and procedures (TTPs) to target victim credentials (spear-phishing emails), but use slightly varied domains, masquerading patterns, decoys, and themes.<br><br> Cluster A: Posing as News Outlets and NGOs: <br>- Suspected Targeting: credentials of journalists, researchers, and geopolitical entities in regions of interest to Iran. <br>- Masquerading as: The Washington Post (U.S.), The Economist (UK), The Jerusalem Post (IL), Khaleej Times (UAE), Azadliq (Azerbaijan), and more news outlets and NGOs. This often involves the use of typosquatted domains like washinqtonpost[.]press. <br><br>“Mandiant did not observe APT42 target or compromise these organizations, but rather impersonate them.”</I><br><br> In this example APT42, an Iranian state-sponsored cyber espionage actor, impersonated existing news organisations and NGOs (T0097.202 News Outlet Persona, T0097.207: NGO Persona, T00143.004: Impersonated Persona) in attempts to steal credentials from targets (T0141.001: Acquire Compromised Account), using elements of influence operations to facilitate their cyber attacks. |
| [T0097.207 NGO Persona](../../generated_pages/techniques/T0097.207.md) | IT00000224 <i>“Mandiant identified at least three clusters of infrastructure used by [Iranian state-sponsored cyber espionage actor] APT42 to harvest credentials from targets in the policy and government sectors, media organizations and journalists, and NGOs and activists. The three clusters employ similar tactics, techniques and procedures (TTPs) to target victim credentials (spear-phishing emails), but use slightly varied domains, masquerading patterns, decoys, and themes.<br><br> Cluster A: Posing as News Outlets and NGOs: <br>- Suspected Targeting: credentials of journalists, researchers, and geopolitical entities in regions of interest to Iran. <br>- Masquerading as: The Washington Post (U.S.), The Economist (UK), The Jerusalem Post (IL), Khaleej Times (UAE), Azadliq (Azerbaijan), and more news outlets and NGOs. This often involves the use of typosquatted domains like washinqtonpost[.]press. <br><br>“Mandiant did not observe APT42 target or compromise these organizations, but rather impersonate them.”</I><br><br> In this example APT42, an Iranian state-sponsored cyber espionage actor, impersonated existing news organisations and NGOs (T0097.202 News Outlet Persona, T0097.207: NGO Persona, T00143.004: Impersonated Persona) in attempts to steal credentials from targets (T0141.001: Acquire Compromised Account), using elements of influence operations to facilitate their cyber attacks. |
| [T0141.001 Acquire Compromised Account](../../generated_pages/techniques/T0141.001.md) | IT00000226 <i>“Mandiant identified at least three clusters of infrastructure used by [Iranian state-sponsored cyber espionage actor] APT42 to harvest credentials from targets in the policy and government sectors, media organizations and journalists, and NGOs and activists. The three clusters employ similar tactics, techniques and procedures (TTPs) to target victim credentials (spear-phishing emails), but use slightly varied domains, masquerading patterns, decoys, and themes.<br><br> Cluster A: Posing as News Outlets and NGOs: <br>- Suspected Targeting: credentials of journalists, researchers, and geopolitical entities in regions of interest to Iran. <br>- Masquerading as: The Washington Post (U.S.), The Economist (UK), The Jerusalem Post (IL), Khaleej Times (UAE), Azadliq (Azerbaijan), and more news outlets and NGOs. This often involves the use of typosquatted domains like washinqtonpost[.]press. <br><br>“Mandiant did not observe APT42 target or compromise these organizations, but rather impersonate them.”</I><br><br> In this example APT42, an Iranian state-sponsored cyber espionage actor, impersonated existing news organisations and NGOs (T0097.202 News Outlet Persona, T0097.207: NGO Persona, T00143.004: Impersonated Persona) in attempts to steal credentials from targets (T0141.001: Acquire Compromised Account), using elements of influence operations to facilitate their cyber attacks. |
| [T0097.202 News Outlet Persona](../../generated_pages/techniques/T0097.202.md) | IT00000223 <i>“Mandiant identified at least three clusters of infrastructure used by [Iranian state-sponsored cyber espionage actor] APT42 to harvest credentials from targets in the policy and government sectors, media organizations and journalists, and NGOs and activists. The three clusters employ similar tactics, techniques and procedures (TTPs) to target victim credentials (spear-phishing emails), but use slightly varied domains, masquerading patterns, decoys, and themes.<br><br> Cluster A: Posing as News Outlets and NGOs: <br>- Suspected Targeting: credentials of journalists, researchers, and geopolitical entities in regions of interest to Iran. <br>- Masquerading as: The Washington Post (U.S.), The Economist (UK), The Jerusalem Post (IL), Khaleej Times (UAE), Azadliq (Azerbaijan), and more news outlets and NGOs. This often involves the use of typosquatted domains like washinqtonpost[.]press. <br><br>“Mandiant did not observe APT42 target or compromise these organizations, but rather impersonate them.”</I><br><br> In this example APT42, an Iranian state-sponsored cyber espionage actor, impersonated existing news organisations and NGOs (T0097.202 News Outlet Persona, T0097.207: NGO Persona, T0143.003: Impersonated Persona) in attempts to steal credentials from targets (T0141.001: Acquire Compromised Account), using elements of influence operations to facilitate their cyber attacks. |
| [T0097.207 NGO Persona](../../generated_pages/techniques/T0097.207.md) | IT00000232 <I>“[Iranian state-sponsored cyber espionage actor] APT42 cloud operations attack lifecycle can be described in details as follows:<br> <br>- “Social engineering schemes involving decoys and trust building, which includes masquerading as legitimate NGOs and conducting ongoing correspondence with the target, sometimes lasting several weeks. <br>- The threat actor masqueraded as well-known international organizations in the legal and NGO fields and sent emails from domains typosquatting the original NGO domains, for example aspenlnstitute[.]org. <br>- The Aspen Institute became aware of this spoofed domain and collaborated with industry partners, including blocking it in SafeBrowsing, thus protecting users of Google Chrome and additional browsers. <br>- To increase their credibility, APT42 impersonated high-ranking personnel working at the aforementioned organizations when creating the email personas. <br>- APT42 enhanced their campaign credibility by using decoy material inviting targets to legitimate and relevant events and conferences. In one instance, the decoy material was hosted on an attacker-controlled SharePoint folder, accessible only after the victim entered their credentials. Mandiant did not identify malicious elements in the files, suggesting they were used solely to gain the victim’s trust.”</I><br><br> In this example APT42, an Iranian state-sponsored cyber espionage actor, created a domain impersonating the existing NGO The Aspen Institute (T0143.003: Impersonated Persona, T0097.207: NGO Persona). They increased the perceived legitimacy of the impersonation by also impersonating high-ranking employees of the NGO (T0097.100: Individual Persona, T0143.003: Impersonated Persona). |
| [T0141.001 Acquire Compromised Account](../../generated_pages/techniques/T0141.001.md) | IT00000226 <i>“Mandiant identified at least three clusters of infrastructure used by [Iranian state-sponsored cyber espionage actor] APT42 to harvest credentials from targets in the policy and government sectors, media organizations and journalists, and NGOs and activists. The three clusters employ similar tactics, techniques and procedures (TTPs) to target victim credentials (spear-phishing emails), but use slightly varied domains, masquerading patterns, decoys, and themes.<br><br> Cluster A: Posing as News Outlets and NGOs: <br>- Suspected Targeting: credentials of journalists, researchers, and geopolitical entities in regions of interest to Iran. <br>- Masquerading as: The Washington Post (U.S.), The Economist (UK), The Jerusalem Post (IL), Khaleej Times (UAE), Azadliq (Azerbaijan), and more news outlets and NGOs. This often involves the use of typosquatted domains like washinqtonpost[.]press. <br><br>“Mandiant did not observe APT42 target or compromise these organizations, but rather impersonate them.”</I><br><br> In this example APT42, an Iranian state-sponsored cyber espionage actor, impersonated existing news organisations and NGOs (T0097.202 News Outlet Persona, T0097.207: NGO Persona, T0143.003: Impersonated Persona) in attempts to steal credentials from targets (T0141.001: Acquire Compromised Account), using elements of influence operations to facilitate their cyber attacks. |
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000229 <I>“In March 2023, [Iranian state-sponsored cyber espionage actor] APT42 sent a spear-phishing email with a fake Google Meet invitation, allegedly sent on behalf of Mona Louri, a likely fake persona leveraged by APT42, claiming to be a human rights activist and researcher. Upon entry, the user was presented with a fake Google Meet page and asked to enter their credentials, which were subsequently sent to the attackers.”</i><br><br>In this example APT42, an Iranian state-sponsored cyber espionage actor, created an account which presented as a human rights activist (T0097.103: Activist Persona) and researcher (T0097.107: Researcher Persona). The analysts assert that it was likely the persona was fabricated (T0143.002: Fabricated Persona) |
| [T0143.003 Impersonated Persona](../../generated_pages/techniques/T0143.003.md) | IT00000230 <I>“[Iranian state-sponsored cyber espionage actor] APT42 cloud operations attack lifecycle can be described in details as follows:<br> <br>- “Social engineering schemes involving decoys and trust building, which includes masquerading as legitimate NGOs and conducting ongoing correspondence with the target, sometimes lasting several weeks. <br>- The threat actor masqueraded as well-known international organizations in the legal and NGO fields and sent emails from domains typosquatting the original NGO domains, for example aspenlnstitute[.]org. <br>- The Aspen Institute became aware of this spoofed domain and collaborated with industry partners, including blocking it in SafeBrowsing, thus protecting users of Google Chrome and additional browsers. <br>- To increase their credibility, APT42 impersonated high-ranking personnel working at the aforementioned organizations when creating the email personas. <br>- APT42 enhanced their campaign credibility by using decoy material inviting targets to legitimate and relevant events and conferences. In one instance, the decoy material was hosted on an attacker-controlled SharePoint folder, accessible only after the victim entered their credentials. Mandiant did not identify malicious elements in the files, suggesting they were used solely to gain the victim’s trust.”</I><br><br> In this example APT42, an Iranian state-sponsored cyber espionage actor, created a domain impersonating the existing NGO The Aspen Institute (T0143.003: Impersonated Persona, T0097.207: NGO Persona). They increased the perceived legitimacy of the impersonation by also impersonating high-ranking employees of the NGO (T0097.100: Individual Persona, T0143.003: Impersonated Persona). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
| Technique | Description given for this incident |
| --------- | ------------------------- |
| [T0097.205 Business Persona](../../generated_pages/techniques/T0097.205.md) | IT00000234 <i>“Twitter Blue launched [November 2022], giving any users who pay $8 a month the ability to be verified on the site, a feature previously only available to public figures, government officials and journalists as a way to show they are who they claim to be.<br><br> “[A day after the launch], an account with the handle @EliLillyandCo labeled itself with the name “Eli Lilly and Company,” and by using the same logo as the company in its profile picture and with the verification checkmark, was indistinguishable from the real company (the picture has since been removed and the account has labeled itself as a parody profile).<br><br> The parody account tweeted “we are excited to announce insulin is free now.””</i><br><br> In this example an account impersonated the pharmaceutical company Eli Lilly (T0097.205: Business Persona, T0143.003: Impersonated Persona) by copying its name, profile picture (T0145.001: Copy Account Imagery), and paying for verification. |
| [T0143.004 Parody Persona](../../generated_pages/techniques/T0143.004.md) | IT00000235 <i>“Twitter Blue launched [November 2022], giving any users who pay $8 a month the ability to be verified on the site, a feature previously only available to public figures, government officials and journalists as a way to show they are who they claim to be.<br><br> “[A day after the launch], an account with the handle @EliLillyandCo labeled itself with the name “Eli Lilly and Company,” and by using the same logo as the company in its profile picture and with the verification checkmark, was indistinguishable from the real company (the picture has since been removed and the account has labeled itself as a parody profile).<br><br> The parody account tweeted “we are excited to announce insulin is free now.””</i><br><br> In this example an account impersonated the pharmaceutical company Eli Lilly (T0097.205: Business Persona, T0143.003: Impersonated Persona) by copying its name, profile picture (T0145.001: Copy Account Imagery), and paying for verification. |
| [T0143.003 Impersonated Persona](../../generated_pages/techniques/T0143.003.md) | IT00000235 <i>“Twitter Blue launched [November 2022], giving any users who pay $8 a month the ability to be verified on the site, a feature previously only available to public figures, government officials and journalists as a way to show they are who they claim to be.<br><br> “[A day after the launch], an account with the handle @EliLillyandCo labeled itself with the name “Eli Lilly and Company,” and by using the same logo as the company in its profile picture and with the verification checkmark, was indistinguishable from the real company (the picture has since been removed and the account has labeled itself as a parody profile).<br><br> The parody account tweeted “we are excited to announce insulin is free now.””</i><br><br> In this example an account impersonated the pharmaceutical company Eli Lilly (T0097.205: Business Persona, T0143.003: Impersonated Persona) by copying its name, profile picture (T0145.001: Copy Account Imagery), and paying for verification. |
| [T0145.001 Copy Account Imagery](../../generated_pages/techniques/T0145.001.md) | IT00000233 <i>“Twitter Blue launched [November 2022], giving any users who pay $8 a month the ability to be verified on the site, a feature previously only available to public figures, government officials and journalists as a way to show they are who they claim to be.<br><br> “[A day after the launch], an account with the handle @EliLillyandCo labeled itself with the name “Eli Lilly and Company,” and by using the same logo as the company in its profile picture and with the verification checkmark, was indistinguishable from the real company (the picture has since been removed and the account has labeled itself as a parody profile).<br><br> The parody account tweeted “we are excited to announce insulin is free now.””</i><br><br> In this example an account impersonated the pharmaceutical company Eli Lilly (T0097.205: Business Persona, T0143.003: Impersonated Persona) by copying its name, profile picture (T0145.001: Copy Account Imagery), and paying for verification. |
| Technique | Description given for this incident |
| --------- | ------------------------- |
| [T0084.002 Plagiarise Content](../../generated_pages/techniques/T0084.002.md) | <i>“On August 16, 2022, pro-Kremlin Telegram channel Joker DPR (Джокер ДНР) published a forged letter allegedly written by Ukrainian Foreign Minister Dmytro Kuleba. In the letter, Kuleba supposedly asked relevant Polish authorities to rename Belwederska Street in Warsaw — the location of the Russian embassy building — as Stepan Bandera Street, in honor of the far-right nationalist who led the Ukrainian Insurgent Army during WWII.<br><br> [...]<br><br> The letter is not dated, and Dmytro Kuleba’s signature seems to be copied from a publicly available letter signed by him in 2021.”</i><br><br> In this example the Telegram channel Joker DPR published a forged letter (T0085.004: Develop Document) in which they impersonated the Ukrainian Minister of Foreign Affairs (T0097.111: Government Official Persona, T0143.003: Impersonated Persona), using Ministry letterhead (T0097.206: Government Institution Persona, T0143.003: Impersonated Persona). |
| [T0085.004 Develop Document](../../generated_pages/techniques/T0085.004.md) | IT00000324 <i>“On August 16, 2022, pro-Kremlin Telegram channel Joker DPR (Джокер ДНР) published a forged letter allegedly written by Ukrainian Foreign Minister Dmytro Kuleba. In the letter, Kuleba supposedly asked relevant Polish authorities to rename Belwederska Street in Warsaw — the location of the Russian embassy building — as Stepan Bandera Street, in honor of the far-right nationalist who led the Ukrainian Insurgent Army during WWII.<br><br> [...]<br><br> The letter is not dated, and Dmytro Kuleba’s signature seems to be copied from a publicly available letter signed by him in 2021.”</i><br><br> In this example the Telegram channel Joker DPR published a forged letter (T0085.004: Develop Document) in which they impersonated the Ukrainian Minister of Foreign Affairs (T0097.111: Government Official Persona, T0143.003: Impersonated Persona), using Ministry letterhead (T0097.206: Government Institution Persona, T0143.003: Impersonated Persona). |
| [T0097.101 Local Persona](../../generated_pages/techniques/T0097.101.md) | IT00000238 <i>“The August 17 [2022] Telegram post [which contained a falsified letter from the Ukrainian Minister of Foreign Affairs asking Poland to rename Belwederska Street in Warsaw — the location of the Russian embassy building — as Stepan Bandera Street, in honor of the far-right nationalist who led the Ukrainian Insurgent Army during WWII] also contained screenshots of Facebook posts that appeared on two Facebook accounts belonging to Polish nationals Piotr Górka, an expert in the history of the Polish Air Force, and Dariusz Walusiak, a Polish historian and documentary maker. The Górka post suggested that he fully supported the Polish government’s decision to change Belwederska Street to Stepan Bandera Street.<br><br> “In a statement to the DFRLab, Górka said his account was accessed without his consent. “This is not my post loaded to my Facebook page,” he explained. “My site was hacked, some days ago.” At the time of publishing, Piotr Górka’s post and his Facebook account were no longer accessible.<br><br> “The post on Górka’s Facebook page was shared by Dariusz Walusiak’s Facebook account; the account also reposted it on the Facebook walls of more than twenty other Facebook users, including Adam Kalita, currently working at Krakow branch of the Institute of National Remembrance; Jan Kasprzyk, head of the Office for War Veterans and Victims of Oppression; and Alicja Kondraciuk, a Polish public figure living in Krakow.<br><br> “Walusiak’s Facebook account is also no longer accessible. Given his work on Polish history and identity, it seems highly unlikely he would support the Bandera measure; the DFRLab has also reached out to him for comment.<br><br> “The fact that Joker DPR’s Telegram post included screenshots of their Facebook posts raises the strong possibility that both Facebook accounts were compromised, and that hackers planted false statements on their pages that would seem out of character for them in order to gain further attention to the forged documents.”</i><br><br> In this example, threat actors used compromised accounts (T0141.001: Acquire Compromised Account) of Polish historians who have enough relevant knowledge to plausibly weigh in on the forged letter’s narrative (T0143.003: Impersonated Persona, T0097.101: Local Persona, T0097.108: Expert Persona).<br><br> This matches T0097.108: Expert Persona because the impersonation exploited Górka and Walusiak’s existing personas as experts in Polish history. |
| [T0097.108 Expert Persona](../../generated_pages/techniques/T0097.108.md) | IT00000239 <i>“The August 17 [2022] Telegram post [which contained a falsified letter from the Ukrainian Minister of Foreign Affairs asking Poland to rename Belwederska Street in Warsaw — the location of the Russian embassy building — as Stepan Bandera Street, in honor of the far-right nationalist who led the Ukrainian Insurgent Army during WWII] also contained screenshots of Facebook posts that appeared on two Facebook accounts belonging to Polish nationals Piotr Górka, an expert in the history of the Polish Air Force, and Dariusz Walusiak, a Polish historian and documentary maker. The Górka post suggested that he fully supported the Polish government’s decision to change Belwederska Street to Stepan Bandera Street.<br><br> “In a statement to the DFRLab, Górka said his account was accessed without his consent. “This is not my post loaded to my Facebook page,” he explained. “My site was hacked, some days ago.” At the time of publishing, Piotr Górka’s post and his Facebook account were no longer accessible.<br><br> “The post on Górka’s Facebook page was shared by Dariusz Walusiak’s Facebook account; the account also reposted it on the Facebook walls of more than twenty other Facebook users, including Adam Kalita, currently working at Krakow branch of the Institute of National Remembrance; Jan Kasprzyk, head of the Office for War Veterans and Victims of Oppression; and Alicja Kondraciuk, a Polish public figure living in Krakow.<br><br> “Walusiak’s Facebook account is also no longer accessible. Given his work on Polish history and identity, it seems highly unlikely he would support the Bandera measure; the DFRLab has also reached out to him for comment.<br><br> “The fact that Joker DPR’s Telegram post included screenshots of their Facebook posts raises the strong possibility that both Facebook accounts were compromised, and that hackers planted false statements on their pages that would seem out of character for them in order to gain further attention to the forged documents.”</i><br><br> In this example, threat actors used compromised accounts (T0141.001: Acquire Compromised Account) of Polish historians who have enough relevant knowledge to plausibly weigh in on the forged letter’s narrative (T0143.003: Impersonated Persona, T0097.101: Local Persona, T0097.108: Expert Persona).<br><br> This matches T0097.108: Expert Persona because the impersonation exploited Górka and Walusiak’s existing personas as experts in Polish history. |
| [T0097.110 Party Official Persona](../../generated_pages/techniques/T0097.110.md) | <i>“On August 16, 2022, pro-Kremlin Telegram channel Joker DPR (Джокер ДНР) published a forged letter allegedly written by Ukrainian Foreign Minister Dmytro Kuleba. In the letter, Kuleba supposedly asked relevant Polish authorities to rename Belwederska Street in Warsaw — the location of the Russian embassy building — as Stepan Bandera Street, in honor of the far-right nationalist who led the Ukrainian Insurgent Army during WWII.<br><br> [...]<br><br> The letter is not dated, and Dmytro Kuleba’s signature seems to be copied from a publicly available letter signed by him in 2021.”</i><br><br> In this example the Telegram channel Joker DPR published a forged letter (T0085.004: Develop Document) in which they impersonated the Ukrainian Minister of Foreign Affairs (T0097.111: Government Official Persona, T0143.003: Impersonated Persona), using Ministry letterhead (T0097.206: Government Institution Persona, T0143.003: Impersonated Persona). |
| [T0097.111 Government Official Persona](../../generated_pages/techniques/T0097.111.md) | <i>“On August 16, 2022, pro-Kremlin Telegram channel Joker DPR (Джокер ДНР) published a forged letter allegedly written by Ukrainian Foreign Minister Dmytro Kuleba. In the letter, Kuleba supposedly asked relevant Polish authorities to rename Belwederska Street in Warsaw — the location of the Russian embassy building — as Stepan Bandera Street, in honor of the far-right nationalist who led the Ukrainian Insurgent Army during WWII.<br><br> [...]<br><br> The letter is not dated, and Dmytro Kuleba’s signature seems to be copied from a publicly available letter signed by him in 2021.”</i><br><br> In this example the Telegram channel Joker DPR published a forged letter (T0085.004: Develop Document) in which they impersonated the Ukrainian Minister of Foreign Affairs (T0097.111: Government Official Persona, T0143.003: Impersonated Persona), using Ministry letterhead (T0097.206: Government Institution Persona, T0143.003: Impersonated Persona). |
|
||||
| [T0097.111 Government Official Persona](../../generated_pages/techniques/T0097.111.md) | <i>“On August 16, 2022, pro-Kremlin Telegram channel Joker DPR (Джокер ДНР) published a forged letter allegedly written by Ukrainian Foreign Minister Dmytro Kuleba. In the letter, Kuleba supposedly asked relevant Polish authorities to rename Belwederska Street in Warsaw — the location of the Russian embassy building — as Stepan Bandera Street, in honor of the far-right nationalist who led the Ukrainian Insurgent Army during WWII.<br><br> [...]<br><br> The letter is not dated, and Dmytro Kuleba’s signature seems to be copied from a publicly available letter signed by him in 2021.”</i><br><br> In this example the Telegram channel Joker DPR published a forged letter (T0085.004: Develop Document) in which they impersonated the Ukrainian Minister of Foreign Affairs (T0097.111: Government Official Persona, T0143.003: Impersonated Persona), using Ministry letterhead (T0097.206: Government Institution Persona, T0143.003: Impersonated Persona). |
|
||||
| [T0097.111 Government Official Persona](../../generated_pages/techniques/T0097.111.md) | <i>“On August 16, 2022, pro-Kremlin Telegram channel Joker DPR (Джокер ДНР) published a forged letter allegedly written by Ukrainian Foreign Minister Dmytro Kuleba. In the letter, Kuleba supposedly asked relevant Polish authorities to rename Belwederska Street in Warsaw — the location of the Russian embassy building — as Stepan Bandera Street, in honor of the far-right nationalist who led the Ukrainian Insurgent Army during WWII.<br><br> [...]<br><br> The letter is not dated, and Dmytro Kuleba’s signature seems to be copied from a publicly available letter signed by him in 2021.”</i><br><br> In this example the Telegram channel Joker DPR published a forged letter (T0085.004: Develop Document) in which they impersonated the Ukrainian Minister of Foreign Affairs (T0097.111: Government Official Persona, T0143.003: Impersonated Persona), using Ministry letterhead (T0097.206: Government Institution Persona, T0143.003: Impersonated Persona). |
|
||||
| [T0097.111 Government Official Persona](../../generated_pages/techniques/T0097.111.md) | <i>“On August 16, 2022, pro-Kremlin Telegram channel Joker DPR (Джокер ДНР) published a forged letter allegedly written by Ukrainian Foreign Minister Dmytro Kuleba. In the letter, Kuleba supposedly asked relevant Polish authorities to rename Belwederska Street in Warsaw — the location of the Russian embassy building — as Stepan Bandera Street, in honor of the far-right nationalist who led the Ukrainian Insurgent Army during WWII.<br><br> [...]<br><br> The letter is not dated, and Dmytro Kuleba’s signature seems to be copied from a publicly available letter signed by him in 2021.”</i><br><br> In this example the Telegram channel Joker DPR published a forged letter (T0085.004: Develop Document) in which they impersonated the Ukrainian Minister of Foreign Affairs (T0097.111: Government Official Persona, T0143.003: Impersonated Persona), using Ministry letterhead (T0097.206: Government Institution Persona, T0143.003: Impersonated Persona). |
| [T0097.206 Government Institution Persona](../../generated_pages/techniques/T0097.206.md) | <i>“On August 16, 2022, pro-Kremlin Telegram channel Joker DPR (Джокер ДНР) published a forged letter allegedly written by Ukrainian Foreign Minister Dmytro Kuleba. In the letter, Kuleba supposedly asked relevant Polish authorities to rename Belwederska Street in Warsaw — the location of the Russian embassy building — as Stepan Bandera Street, in honor of the far-right nationalist who led the Ukrainian Insurgent Army during WWII.<br><br> [...]<br><br> The letter is not dated, and Dmytro Kuleba’s signature seems to be copied from a publicly available letter signed by him in 2021.”</i><br><br> In this example, the Telegram channel Joker DPR published a forged letter (T0085.004: Develop Document) in which it impersonated the Ukrainian Minister of Foreign Affairs (T0097.111: Government Official Persona, T0143.003: Impersonated Persona), using Ministry letterhead (T0097.206: Government Institution Persona, T0143.003: Impersonated Persona). |
| [T0129.006 Deny Involvement](../../generated_pages/techniques/T0129.006.md) | <i>“On August 16, 2022, pro-Kremlin Telegram channel Joker DPR (Джокер ДНР) published a forged letter allegedly written by Ukrainian Foreign Minister Dmytro Kuleba. In the letter, Kuleba supposedly asked relevant Polish authorities to rename Belwederska Street in Warsaw — the location of the Russian embassy building — as Stepan Bandera Street, in honor of the far-right nationalist who led the Ukrainian Insurgent Army during WWII.<br><br> [...]<br><br> The letter is not dated, and Dmytro Kuleba’s signature seems to be copied from a publicly available letter signed by him in 2021.”</i><br><br> In this example, the Telegram channel Joker DPR published a forged letter (T0085.004: Develop Document) in which it impersonated the Ukrainian Minister of Foreign Affairs (T0097.111: Government Official Persona, T0143.003: Impersonated Persona), using Ministry letterhead (T0097.206: Government Institution Persona, T0143.003: Impersonated Persona). |
| [T0131 Exploit TOS/Content Moderation](../../generated_pages/techniques/T0131.md) | <i>“On August 16, 2022, pro-Kremlin Telegram channel Joker DPR (Джокер ДНР) published a forged letter allegedly written by Ukrainian Foreign Minister Dmytro Kuleba. In the letter, Kuleba supposedly asked relevant Polish authorities to rename Belwederska Street in Warsaw — the location of the Russian embassy building — as Stepan Bandera Street, in honor of the far-right nationalist who led the Ukrainian Insurgent Army during WWII.<br><br> [...]<br><br> The letter is not dated, and Dmytro Kuleba’s signature seems to be copied from a publicly available letter signed by him in 2021.”</i><br><br> In this example, the Telegram channel Joker DPR published a forged letter (T0085.004: Develop Document) in which it impersonated the Ukrainian Minister of Foreign Affairs (T0097.111: Government Official Persona, T0143.003: Impersonated Persona), using Ministry letterhead (T0097.206: Government Institution Persona, T0143.003: Impersonated Persona). |
| [T0137 Make Money](../../generated_pages/techniques/T0137.md) | <i>“On August 16, 2022, pro-Kremlin Telegram channel Joker DPR (Джокер ДНР) published a forged letter allegedly written by Ukrainian Foreign Minister Dmytro Kuleba. In the letter, Kuleba supposedly asked relevant Polish authorities to rename Belwederska Street in Warsaw — the location of the Russian embassy building — as Stepan Bandera Street, in honor of the far-right nationalist who led the Ukrainian Insurgent Army during WWII.<br><br> [...]<br><br> The letter is not dated, and Dmytro Kuleba’s signature seems to be copied from a publicly available letter signed by him in 2021.”</i><br><br> In this example, the Telegram channel Joker DPR published a forged letter (T0085.004: Develop Document) in which it impersonated the Ukrainian Minister of Foreign Affairs (T0097.111: Government Official Persona, T0143.003: Impersonated Persona), using Ministry letterhead (T0097.206: Government Institution Persona, T0143.003: Impersonated Persona). |
| [T0139.002 Silence](../../generated_pages/techniques/T0139.002.md) | <i>“On August 16, 2022, pro-Kremlin Telegram channel Joker DPR (Джокер ДНР) published a forged letter allegedly written by Ukrainian Foreign Minister Dmytro Kuleba. In the letter, Kuleba supposedly asked relevant Polish authorities to rename Belwederska Street in Warsaw — the location of the Russian embassy building — as Stepan Bandera Street, in honor of the far-right nationalist who led the Ukrainian Insurgent Army during WWII.<br><br> [...]<br><br> The letter is not dated, and Dmytro Kuleba’s signature seems to be copied from a publicly available letter signed by him in 2021.”</i><br><br> In this example the Telegram channel Joker DPR published a forged letter (T0085.004: Develop Document) in which they impersonated the Ukrainian Minister of Foreign Affairs (T0097.111: Government Official Persona, T0143.003: Impersonated Persona), using Ministry letterhead (T0097.206: Government Institution Persona, T0143.003: Impersonated Persona). |
|
||||
| [T0097.111 Government Official Persona](../../generated_pages/techniques/T0097.111.md) | IT00000327 <i>“On August 16, 2022, pro-Kremlin Telegram channel Joker DPR (Джокер ДНР) published a forged letter allegedly written by Ukrainian Foreign Minister Dmytro Kuleba. In the letter, Kuleba supposedly asked relevant Polish authorities to rename Belwederska Street in Warsaw — the location of the Russian embassy building — as Stepan Bandera Street, in honor of the far-right nationalist who led the Ukrainian Insurgent Army during WWII.<br><br> [...]<br><br> The letter is not dated, and Dmytro Kuleba’s signature seems to be copied from a publicly available letter signed by him in 2021.”</i><br><br> In this example the Telegram channel Joker DPR published a forged letter (T0085.004: Develop Document) in which they impersonated the Ukrainian Minister of Foreign Affairs (T0097.111: Government Official Persona, T0143.003: Impersonated Persona), using Ministry letterhead (T0097.206: Government Institution Persona, T0143.003: Impersonated Persona). |
|
||||
| [T0097.206 Government Institution Persona](../../generated_pages/techniques/T0097.206.md) | IT00000326 <i>“On August 16, 2022, pro-Kremlin Telegram channel Joker DPR (Джокер ДНР) published a forged letter allegedly written by Ukrainian Foreign Minister Dmytro Kuleba. In the letter, Kuleba supposedly asked relevant Polish authorities to rename Belwederska Street in Warsaw — the location of the Russian embassy building — as Stepan Bandera Street, in honor of the far-right nationalist who led the Ukrainian Insurgent Army during WWII.<br><br> [...]<br><br> The letter is not dated, and Dmytro Kuleba’s signature seems to be copied from a publicly available letter signed by him in 2021.”</i><br><br> In this example the Telegram channel Joker DPR published a forged letter (T0085.004: Develop Document) in which they impersonated the Ukrainian Minister of Foreign Affairs (T0097.111: Government Official Persona, T0143.003: Impersonated Persona), using Ministry letterhead (T0097.206: Government Institution Persona, T0143.003: Impersonated Persona). |
|
||||
| [T0141.001 Acquire Compromised Account](../../generated_pages/techniques/T0141.001.md) | IT00000236 <i>“The August 17 [2022] Telegram post [which contained a falsified letter from the Ukrainian Minister of Foreign Affairs asking Poland to rename Belwederska Street in Warsaw — the location of the Russian embassy building — as Stepan Bandera Street, in honor of the far-right nationalist who led the Ukrainian Insurgent Army during WWII] also contained screenshots of Facebook posts that appeared on two Facebook accounts belonging to Polish nationals Piotr Górka, an expert in the history of the Polish Air Force, and Dariusz Walusiak, a Polish historian and documentary maker. The Górka post suggested that he fully supported the Polish government’s decision to change Belwederska Street to Stepan Bandera Street.<br><br> “In a statement to the DFRLab, Górka said his account was accessed without his consent. “This is not my post loaded to my Facebook page,” he explained. “My site was hacked, some days ago.” At the time of publishing, Piotr Górka’s post and his Facebook account were no longer accessible.<br><br> “The post on Górka’s Facebook page was shared by Dariusz Walusiak’s Facebook account; the account also reposted it on the Facebook walls of more than twenty other Facebook users, including Adam Kalita, currently working at Krakow branch of the Institute of National Remembrance; Jan Kasprzyk, head of the Office for War Veterans and Victims of Oppression; and Alicja Kondraciuk, a Polish public figure living in Krakow.<br><br> “Walusiak’s Facebook account is also no longer accessible. Given his work on Polish history and identity, it seems highly unlikely he would support the Bandera measure; the DFRLab has also reached out to him for comment.<br><br> “The fact that Joker DPR’s Telegram post included screenshots of their Facebook posts raises the strong possibility that both Facebook accounts were compromised, and that hackers planted false statements on their pages that would seem out of character for them in order to gain further attention to the forged documents.”</i><br><br> In this example, threat actors used compromised accounts (T0141.001: Acquire Compromised Account) of Polish historians who had enough relevant knowledge to plausibly weigh in on the forged letter’s narrative (T0143.003: Impersonated Persona, T0097.101: Local Persona, T0097.108: Expert Persona).<br><br> This matches T0097.108: Expert Persona because the impersonation exploited Górka and Walusiak’s existing personas as experts in Polish history. |
|
||||
| [T0143.001 Authentic Persona](../../generated_pages/techniques/T0143.001.md) | <i>“On August 16, 2022, pro-Kremlin Telegram channel Joker DPR (Джокер ДНР) published a forged letter allegedly written by Ukrainian Foreign Minister Dmytro Kuleba. In the letter, Kuleba supposedly asked relevant Polish authorities to rename Belwederska Street in Warsaw — the location of the Russian embassy building — as Stepan Bandera Street, in honor of the far-right nationalist who led the Ukrainian Insurgent Army during WWII.<br><br> [...]<br><br> The letter is not dated, and Dmytro Kuleba’s signature seems to be copied from a publicly available letter signed by him in 2021.”</i><br><br> In this example the Telegram channel Joker DPR published a forged letter (T0085.004: Develop Document) in which they impersonated the Ukrainian Minister of Foreign Affairs (T0097.111: Government Official Persona, T0143.003: Impersonated Persona), using Ministry letterhead (T0097.206: Government Institution Persona, T0143.003: Impersonated Persona). |
|
||||
| [T0143.003 Impersonated Persona](../../generated_pages/techniques/T0143.003.md) | <i>“On August 16, 2022, pro-Kremlin Telegram channel Joker DPR (Джокер ДНР) published a forged letter allegedly written by Ukrainian Foreign Minister Dmytro Kuleba. In the letter, Kuleba supposedly asked relevant Polish authorities to rename Belwederska Street in Warsaw — the location of the Russian embassy building — as Stepan Bandera Street, in honor of the far-right nationalist who led the Ukrainian Insurgent Army during WWII.<br><br> [...]<br><br> The letter is not dated, and Dmytro Kuleba’s signature seems to be copied from a publicly available letter signed by him in 2021.”</i><br><br> In this example the Telegram channel Joker DPR published a forged letter (T0085.004: Develop Document) in which they impersonated the Ukrainian Minister of Foreign Affairs (T0097.111: Government Official Persona, T0143.003: Impersonated Persona), using Ministry letterhead (T0097.206: Government Institution Persona, T0143.003: Impersonated Persona). |
|
||||
| [T0143.003 Impersonated Persona](../../generated_pages/techniques/T0143.003.md) | IT00000237 <i>“The August 17 [2022] Telegram post [which contained a falsified letter from the Ukrainian Minister of Foreign Affairs asking Poland to rename Belwederska Street in Warsaw — the location of the Russian embassy building — as Stepan Bandera Street, in honor of the far-right nationalist who led the Ukrainian Insurgent Army during WWII] also contained screenshots of Facebook posts that appeared on two Facebook accounts belonging to Polish nationals Piotr Górka, an expert in the history of the Polish Air Force, and Dariusz Walusiak, a Polish historian and documentary maker. The Górka post suggested that he fully supported the Polish government’s decision to change Belwederska Street to Stepan Bandera Street.<br><br> “In a statement to the DFRLab, Górka said his account was accessed without his consent. “This is not my post loaded to my Facebook page,” he explained. “My site was hacked, some days ago.” At the time of publishing, Piotr Górka’s post and his Facebook account were no longer accessible.<br><br> “The post on Górka’s Facebook page was shared by Dariusz Walusiak’s Facebook account; the account also reposted it on the Facebook walls of more than twenty other Facebook users, including Adam Kalita, currently working at Krakow branch of the Institute of National Remembrance; Jan Kasprzyk, head of the Office for War Veterans and Victims of Oppression; and Alicja Kondraciuk, a Polish public figure living in Krakow.<br><br> “Walusiak’s Facebook account is also no longer accessible. Given his work on Polish history and identity, it seems highly unlikely he would support the Bandera measure; the DFRLab has also reached out to him for comment.<br><br> “The fact that Joker DPR’s Telegram post included screenshots of their Facebook posts raises the strong possibility that both Facebook accounts were compromised, and that hackers planted false statements on their pages that would seem out of character for them in order to gain further attention to the forged documents.”</i><br><br> In this example, threat actors used compromised accounts (T0141.001: Acquire Compromised Account) of Polish historians who had enough relevant knowledge to plausibly weigh in on the forged letter’s narrative (T0143.003: Impersonated Persona, T0097.101: Local Persona, T0097.108: Expert Persona).<br><br> This matches T0097.108: Expert Persona because the impersonation exploited Górka and Walusiak’s existing personas as experts in Polish history. |
| [T0145.001 Copy Account Imagery](../../generated_pages/techniques/T0145.001.md) | <i>“On August 16, 2022, pro-Kremlin Telegram channel Joker DPR (Джокер ДНР) published a forged letter allegedly written by Ukrainian Foreign Minister Dmytro Kuleba. In the letter, Kuleba supposedly asked relevant Polish authorities to rename Belwederska Street in Warsaw — the location of the Russian embassy building — as Stepan Bandera Street, in honor of the far-right nationalist who led the Ukrainian Insurgent Army during WWII.<br><br> [...]<br><br> The letter is not dated, and Dmytro Kuleba’s signature seems to be copied from a publicly available letter signed by him in 2021.”</i><br><br> In this example the Telegram channel Joker DPR published a forged letter (T0085.004: Develop Document) in which they impersonated the Ukrainian Minister of Foreign Affairs (T0097.111: Government Official Persona, T0143.003: Impersonated Persona), using Ministry letterhead (T0097.206: Government Institution Persona, T0143.003: Impersonated Persona). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
| [T0097.204 Think Tank Persona](../../generated_pages/techniques/T0097.204.md) | IT00000245 <I>“[Russia’s Internet Research Agency, the IRA] pushed narratives with longform blog content. They created media properties, websites designed to produce stories that would resonate with those targeted. It appears, based on the data set provided by Alphabet, that the IRA may have also expanded into think tank-style communiques. One such page, previously unattributed to the IRA but included in the Alphabet data, was GI Analytics, a geopolitics blog with an international masthead that included American authors. This page was promoted via AdWords and YouTube videos; it has strong ties to more traditional Russian propaganda networks, which will be discussed later in this analysis. GI Analytics wrote articles articulating nuanced academic positions on a variety of sophisticated topics. From the site’s About page:<br><br> ““Our purpose and mission are to provide high-quality analysis at a time when we are faced with a multitude of crises, a collapsing global economy, imperialist wars, environmental disasters, corporate greed, terrorism, deceit, GMO food, a migration crisis and a crackdown on small farmers and ranchers.””</i><br><br> In this example Alphabet’s technical indicators allowed them to assert that GI Analytics, which presented itself as a think tank, was a fabricated institution associated with Russia’s Internet Research Agency (T0097.204: Think Tank Persona, T0143.002: Fabricated Persona). |
| [T0097.208 Social Cause Persona](../../generated_pages/techniques/T0097.208.md) | IT00000251 <i>“The Black Matters Facebook Page [operated by Russia’s Internet Research Agency] explored several visual brand identities, moving from a plain logo to a gothic typeface on Jan 19th, 2016. On February 4th, 2016, the person who ran the Facebook Page announced the launch of the website, blackmattersus[.]com, emphasizing media distrust and a desire to build Black independent media; [“I DIDN’T BELIEVE THE MEDIA / SO I BECAME ONE”]”</i><br><br> In this example an asset controlled by Russia’s Internet Research Agency began to present itself as a source of “Black independent media”, claiming that the media could not be trusted (T0097.208: Social Cause Persona, T0097.202: News Outlet Persona, T0143.002: Fabricated Persona). |
| [T0126.001 Call to Action to Attend](../../generated_pages/techniques/T0126.001.md) | IT00000247 <i>“A few press investigations have alluded to the [Russia’s Internet Research Agency]’s job ads. The extent of the human asset recruitment strategy is revealed in the organic data set. It is expansive, and was clearly a priority. Posts encouraging Americans to perform various types of tasks for IRA handlers appeared in Black, Left, and Right-targeted groups, though they were most numerous in the Black community. They included:<br> <br>- Requests for contact with preachers from Black churches (Black_Baptist_Church) <br>- Offers of free counselling to people with sexual addiction (Army of Jesus) <br>- Soliciting volunteers to hand out fliers <br>- Soliciting volunteers to teach self-defense classes <br>- Offering free self-defense classes (Black Fist/Fit Black) <br>- Requests for followers to attend political rallies <br>- Requests for photographers to document protests <br>- Requests for speakers at protests <br>- Requests to protest the Westborough Baptist Church (LGBT United) <br>- Job offers for designers to help design fliers, sites, Facebook sticker packs <br>- Requests for female followers to send photos for a calendar <br>- Requests for followers to send photos to be shared to the Page (Back the Badge) <br>- Soliciting videos for a YouTube contest called “Pee on Hillary” <br>- Encouraging people to apply to be part of a Black reality TV show <br>- Posting a wide variety of job ads (write for BlackMattersUS and others) <br>- Requests for lawyers to volunteer to assist with immigration cases”</i> <br><br> This behaviour matches T0097.106: Recruiter Persona because the threat actors are presenting tasks for their target audience to complete in the style of a job posting (even though some of the tasks were presented as voluntary / unpaid efforts), including calls for people to attend political rallies (T0126.001: Call to Action to Attend). |
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000246 <I>“[Russia’s Internet Research Agency, the IRA] pushed narratives with longform blog content. They created media properties, websites designed to produce stories that would resonate with those targeted. It appears, based on the data set provided by Alphabet, that the IRA may have also expanded into think tank-style communiques. One such page, previously unattributed to the IRA but included in the Alphabet data, was GI Analytics, a geopolitics blog with an international masthead that included American authors. This page was promoted via AdWords and YouTube videos; it has strong ties to more traditional Russian propaganda networks, which will be discussed later in this analysis. GI Analytics wrote articles articulating nuanced academic positions on a variety of sophisticated topics. From the site’s About page:<br><br> ““Our purpose and mission are to provide high-quality analysis at a time when we are faced with a multitude of crises, a collapsing global economy, imperialist wars, environmental disasters, corporate greed, terrorism, deceit, GMO food, a migration crisis and a crackdown on small farmers and ranchers.””</i><br><br> In this example Alphabet’s technical indicators allowed them to assert that GI Analytics, which presented itself as a think tank, was a fabricated institution associated with Russia’s Internet Research Agency (T0097.204: Think Tank Persona, T0143.002: Fabricated Persona). |
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000249 <i>“The Black Matters Facebook Page [operated by Russia’s Internet Research Agency] explored several visual brand identities, moving from a plain logo to a gothic typeface on Jan 19th, 2016. On February 4th, 2016, the person who ran the Facebook Page announced the launch of the website, blackmattersus[.]com, emphasizing media distrust and a desire to build Black independent media; [“I DIDN’T BELIEVE THE MEDIA / SO I BECAME ONE”]”</i><br><br> In this example an asset controlled by Russia’s Internet Research Agency began to present itself as a source of “Black independent media”, claiming that the media could not be trusted (T0097.208: Social Cause Persona, T0097.202: News Outlet Persona, T0143.002: Fabricated Persona). |
| Technique | Description given for this incident |
| --------- | ------------------------- |
| [T0084.002 Plagiarise Content](../../generated_pages/techniques/T0084.002.md) | <i>“In the campaign’s final weeks, Pastor Mailhol said, the team of Russians made a request: Drop out of the race and support Mr. Rajoelina. He refused.<br><br> “The Russians made the same proposal to the history professor running for president, saying, “If you accept this deal you will have money” according to Ms. Rasamimanana, the professor’s campaign manager.<br><br> When the professor refused, she said, the Russians created a fake Facebook page that mimicked his official page and posted an announcement on it that he was supporting Mr. Rajoelina.”</i><br><br> In this example actors created online accounts styled to look like official pages to trick targets into thinking that the presidential candidate announced that they had dropped out of the election (T0097.110: Party Official Persona, T0143.003: Impersonated Persona). |
| [T0084.002 Plagiarise Content](../../generated_pages/techniques/T0084.002.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary). |
| [T0084.002 Plagiarise Content](../../generated_pages/techniques/T0084.002.md) | <i>“But while Russia’s efforts [at election interference] in the United States fit Moscow’s campaign to upend Western democracy and rattle Mr. Putin’s geopolitical rivals, the undertaking in Madagascar often seemed to have a much simpler objective: profit.<br><br> “Before the election, a Russian company that local officials and foreign diplomats say is controlled by Mr. Prigozhin acquired a major stake in a government-run company that mines chromium, a mineral valued for its use in stainless steel. The acquisition set off protests by workers complaining of unpaid wages, cancelled benefits and foreign intrusion into a sector that had been a source of national pride for Madagascar.<br><br> “It repeated a pattern in which Russia has swooped into African nations, hoping to reshape their politics for material gain. In the Central African Republic, a former Russian intelligence officer is the top security adviser to the country’s president, while companies linked to Mr. Prigozhin have spread across the nation, snapping up diamonds in both legal and illegal ways, according to government officials, warlords in the diamond trade and registration documents showing Mr. Prigozhin’s growing military and commercial footprint.<br><br> [...] “The [operation switched from supporting the incumbent candidate on realising he would lose the election]. After the Russians pirouetted to help Mr. Rajoelina — their former opponent — win the election, Mr. Prigozhin’s company was able to negotiate with the new government to keep control of the chromium mining operation, despite the worker protests, and Mr. Prigozhin’s political operatives remain stationed in the capital to this day.”</i><br><br> This behaviour matches T0137: Make Money because analysts have asserted that the identified influence operation was in part motivated by a goal to generate profit. |
| [T0085.004 Develop Document](../../generated_pages/techniques/T0085.004.md) | <i>“In the campaign’s final weeks, Pastor Mailhol said, the team of Russians made a request: Drop out of the race and support Mr. Rajoelina. He refused.<br><br> “The Russians made the same proposal to the history professor running for president, saying, “If you accept this deal you will have money” according to Ms. Rasamimanana, the professor’s campaign manager.<br><br> When the professor refused, she said, the Russians created a fake Facebook page that mimicked his official page and posted an announcement on it that he was supporting Mr. Rajoelina.”</i><br><br> In this example actors created online accounts styled to look like official pages to trick targets into thinking that the presidential candidate announced that they had dropped out of the election (T0097.110: Party Official Persona, T0143.003: Impersonated Persona). |
| [T0085.004 Develop Document](../../generated_pages/techniques/T0085.004.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary). |
| [T0085.004 Develop Document](../../generated_pages/techniques/T0085.004.md) | <i>“But while Russia’s efforts [at election interference] in the United States fit Moscow’s campaign to upend Western democracy and rattle Mr. Putin’s geopolitical rivals, the undertaking in Madagascar often seemed to have a much simpler objective: profit.<br><br> “Before the election, a Russian company that local officials and foreign diplomats say is controlled by Mr. Prigozhin acquired a major stake in a government-run company that mines chromium, a mineral valued for its use in stainless steel. The acquisition set off protests by workers complaining of unpaid wages, cancelled benefits and foreign intrusion into a sector that had been a source of national pride for Madagascar.<br><br> “It repeated a pattern in which Russia has swooped into African nations, hoping to reshape their politics for material gain. In the Central African Republic, a former Russian intelligence officer is the top security adviser to the country’s president, while companies linked to Mr. Prigozhin have spread across the nation, snapping up diamonds in both legal and illegal ways, according to government officials, warlords in the diamond trade and registration documents showing Mr. Prigozhin’s growing military and commercial footprint.<br><br> [...] “The [operation switched from supporting the incumbent candidate on realising he would lose the election]. After the Russians pirouetted to help Mr. Rajoelina — their former opponent — win the election, Mr. Prigozhin’s company was able to negotiate with the new government to keep control of the chromium mining operation, despite the worker protests, and Mr. Prigozhin’s political operatives remain stationed in the capital to this day.”</i><br><br> This behaviour matches T0137: Make Money because analysts have asserted that the identified influence operation was in part motivated by a goal to generate profit. |
| [T0097.110 Party Official Persona](../../generated_pages/techniques/T0097.110.md) | <i>“In the campaign’s final weeks, Pastor Mailhol said, the team of Russians made a request: Drop out of the race and support Mr. Rajoelina. He refused.<br><br> “The Russians made the same proposal to the history professor running for president, saying, “If you accept this deal you will have money” according to Ms. Rasamimanana, the professor’s campaign manager.<br><br> When the professor refused, she said, the Russians created a fake Facebook page that mimicked his official page and posted an announcement on it that he was supporting Mr. Rajoelina.”</i><br><br> In this example actors created online accounts styled to look like official pages to trick targets into thinking that the presidential candidate announced that they had dropped out of the election (T0097.110: Party Official Persona, T0143.003: Impersonated Persona). |
| [T0097.110 Party Official Persona](../../generated_pages/techniques/T0097.110.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary). |
| [T0097.110 Party Official Persona](../../generated_pages/techniques/T0097.110.md) | <i>“But while Russia’s efforts [at election interference] in the United States fit Moscow’s campaign to upend Western democracy and rattle Mr. Putin’s geopolitical rivals, the undertaking in Madagascar often seemed to have a much simpler objective: profit.<br><br> “Before the election, a Russian company that local officials and foreign diplomats say is controlled by Mr. Prigozhin acquired a major stake in a government-run company that mines chromium, a mineral valued for its use in stainless steel. The acquisition set off protests by workers complaining of unpaid wages, cancelled benefits and foreign intrusion into a sector that had been a source of national pride for Madagascar.<br><br> “It repeated a pattern in which Russia has swooped into African nations, hoping to reshape their politics for material gain. In the Central African Republic, a former Russian intelligence officer is the top security adviser to the country’s president, while companies linked to Mr. Prigozhin have spread across the nation, snapping up diamonds in both legal and illegal ways, according to government officials, warlords in the diamond trade and registration documents showing Mr. Prigozhin’s growing military and commercial footprint.<br><br> [...] “The [operation switched from supporting the incumbent candidate on realising he would lose the election]. After the Russians pirouetted to help Mr. Rajoelina — their former opponent — win the election, Mr. Prigozhin’s company was able to negotiate with the new government to keep control of the chromium mining operation, despite the worker protests, and Mr. Prigozhin’s political operatives remain stationed in the capital to this day.”</i><br><br> This behaviour matches T0137: Make Money because analysts have asserted that the identified influence operation was in part motivated by a goal to generate profit. |
| [T0097.111 Government Official Persona](../../generated_pages/techniques/T0097.111.md) | <i>“In the campaign’s final weeks, Pastor Mailhol said, the team of Russians made a request: Drop out of the race and support Mr. Rajoelina. He refused.<br><br> “The Russians made the same proposal to the history professor running for president, saying, “If you accept this deal you will have money” according to Ms. Rasamimanana, the professor’s campaign manager.<br><br> When the professor refused, she said, the Russians created a fake Facebook page that mimicked his official page and posted an announcement on it that he was supporting Mr. Rajoelina.”</i><br><br> In this example actors created online accounts styled to look like official pages to trick targets into thinking that the presidential candidate announced that they had dropped out of the election (T0097.110: Party Official Persona, T0143.003: Impersonated Persona). |
|
||||
| [T0097.111 Government Official Persona](../../generated_pages/techniques/T0097.111.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer and fall of 2018 hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> “‘Russia should influence elections around the world, the same way the United States influences elections,’ he wrote. ‘Sooner or later Russia will return to global politics as a global player,’ he added. ‘And the American establishment will just have to accept that.’”</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference, despite the evidence to the contrary. |
| [T0097.111 Government Official Persona](../../generated_pages/techniques/T0097.111.md) | <i>“But while Russia’s efforts [at election interference] in the United States fit Moscow’s campaign to upend Western democracy and rattle Mr. Putin’s geopolitical rivals, the undertaking in Madagascar often seemed to have a much simpler objective: profit.<br><br> “Before the election, a Russian company that local officials and foreign diplomats say is controlled by Mr. Prigozhin acquired a major stake in a government-run company that mines chromium, a mineral valued for its use in stainless steel. The acquisition set off protests by workers complaining of unpaid wages, canceled benefits and foreign intrusion into a sector that had been a source of national pride for Madagascar.<br><br> “It repeated a pattern in which Russia has swooped into African nations, hoping to reshape their politics for material gain. In the Central African Republic, a former Russian intelligence officer is the top security adviser to the country’s president, while companies linked to Mr. Prigozhin have spread across the nation, snapping up diamonds in both legal and illegal ways, according to government officials, warlords in the diamond trade and registration documents showing Mr. Prigozhin’s growing military and commercial footprint.<br><br> [...] “[The operation switched away from supporting the incumbent candidate on realising he would lose the election.] After the Russians pirouetted to help Mr. Rajoelina — their former opponent — win the election, Mr. Prigozhin’s company was able to negotiate with the new government to keep control of the chromium mining operation, despite the worker protests, and Mr. Prigozhin’s political operatives remain stationed in the capital to this day.”</i><br><br> This behaviour matches T0137: Make Money because analysts have asserted that the identified influence operation was motivated in part by a goal to generate profit. |
| [T0097.206 Government Institution Persona](../../generated_pages/techniques/T0097.206.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer and fall of 2018 hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> “‘Russia should influence elections around the world, the same way the United States influences elections,’ he wrote. ‘Sooner or later Russia will return to global politics as a global player,’ he added. ‘And the American establishment will just have to accept that.’”</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference, despite the evidence to the contrary. |
| [T0097.206 Government Institution Persona](../../generated_pages/techniques/T0097.206.md) | <i>“But while Russia’s efforts [at election interference] in the United States fit Moscow’s campaign to upend Western democracy and rattle Mr. Putin’s geopolitical rivals, the undertaking in Madagascar often seemed to have a much simpler objective: profit.<br><br> “Before the election, a Russian company that local officials and foreign diplomats say is controlled by Mr. Prigozhin acquired a major stake in a government-run company that mines chromium, a mineral valued for its use in stainless steel. The acquisition set off protests by workers complaining of unpaid wages, canceled benefits and foreign intrusion into a sector that had been a source of national pride for Madagascar.<br><br> “It repeated a pattern in which Russia has swooped into African nations, hoping to reshape their politics for material gain. In the Central African Republic, a former Russian intelligence officer is the top security adviser to the country’s president, while companies linked to Mr. Prigozhin have spread across the nation, snapping up diamonds in both legal and illegal ways, according to government officials, warlords in the diamond trade and registration documents showing Mr. Prigozhin’s growing military and commercial footprint.<br><br> [...] “[The operation switched away from supporting the incumbent candidate on realising he would lose the election.] After the Russians pirouetted to help Mr. Rajoelina — their former opponent — win the election, Mr. Prigozhin’s company was able to negotiate with the new government to keep control of the chromium mining operation, despite the worker protests, and Mr. Prigozhin’s political operatives remain stationed in the capital to this day.”</i><br><br> This behaviour matches T0137: Make Money because analysts have asserted that the identified influence operation was motivated in part by a goal to generate profit. |
| [T0097.206 Government Institution Persona](../../generated_pages/techniques/T0097.206.md) | <i>“In the campaign’s final weeks, Pastor Mailhol said, the team of Russians made a request: Drop out of the race and support Mr. Rajoelina. He refused.<br><br> “The Russians made the same proposal to the history professor running for president, saying, ‘If you accept this deal you will have money,’ according to Ms. Rasamimanana, the professor’s campaign manager.<br><br> “When the professor refused, she said, the Russians created a fake Facebook page that mimicked his official page and posted an announcement on it that he was supporting Mr. Rajoelina.”</i><br><br> In this example actors created an online account styled to look like a presidential candidate’s official page, and used it to trick targets into thinking that the candidate had announced he was dropping out of the election (T0097.110: Party Official Persona, T0143.003: Impersonated Persona). |
| [T0129.006 Deny Involvement](../../generated_pages/techniques/T0129.006.md) | <i>“In the campaign’s final weeks, Pastor Mailhol said, the team of Russians made a request: Drop out of the race and support Mr. Rajoelina. He refused.<br><br> “The Russians made the same proposal to the history professor running for president, saying, ‘If you accept this deal you will have money,’ according to Ms. Rasamimanana, the professor’s campaign manager.<br><br> “When the professor refused, she said, the Russians created a fake Facebook page that mimicked his official page and posted an announcement on it that he was supporting Mr. Rajoelina.”</i><br><br> In this example actors created an online account styled to look like a presidential candidate’s official page, and used it to trick targets into thinking that the candidate had announced he was dropping out of the election (T0097.110: Party Official Persona, T0143.003: Impersonated Persona). |
| [T0129.006 Deny Involvement](../../generated_pages/techniques/T0129.006.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer and fall of 2018 hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> “‘Russia should influence elections around the world, the same way the United States influences elections,’ he wrote. ‘Sooner or later Russia will return to global politics as a global player,’ he added. ‘And the American establishment will just have to accept that.’”</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference, despite the evidence to the contrary. |
| [T0129.006 Deny Involvement](../../generated_pages/techniques/T0129.006.md) | <i>“But while Russia’s efforts [at election interference] in the United States fit Moscow’s campaign to upend Western democracy and rattle Mr. Putin’s geopolitical rivals, the undertaking in Madagascar often seemed to have a much simpler objective: profit.<br><br> “Before the election, a Russian company that local officials and foreign diplomats say is controlled by Mr. Prigozhin acquired a major stake in a government-run company that mines chromium, a mineral valued for its use in stainless steel. The acquisition set off protests by workers complaining of unpaid wages, canceled benefits and foreign intrusion into a sector that had been a source of national pride for Madagascar.<br><br> “It repeated a pattern in which Russia has swooped into African nations, hoping to reshape their politics for material gain. In the Central African Republic, a former Russian intelligence officer is the top security adviser to the country’s president, while companies linked to Mr. Prigozhin have spread across the nation, snapping up diamonds in both legal and illegal ways, according to government officials, warlords in the diamond trade and registration documents showing Mr. Prigozhin’s growing military and commercial footprint.<br><br> [...] “[The operation switched away from supporting the incumbent candidate on realising he would lose the election.] After the Russians pirouetted to help Mr. Rajoelina — their former opponent — win the election, Mr. Prigozhin’s company was able to negotiate with the new government to keep control of the chromium mining operation, despite the worker protests, and Mr. Prigozhin’s political operatives remain stationed in the capital to this day.”</i><br><br> This behaviour matches T0137: Make Money because analysts have asserted that the identified influence operation was motivated in part by a goal to generate profit. |
| [T0129.006 Deny Involvement](../../generated_pages/techniques/T0129.006.md) | "<I>“In the campaign’s final weeks, Pastor Mailhol said, the team of Russians made a request: Drop out of the race and support Mr. Rajoelina. He refused.<br><br> “The Russians made the same proposal to the history professor running for president, saying, “If you accept this deal you will have money” according to Ms. Rasamimanana, the professor’s campaign manager.<br><br> When the professor refused, she said, the Russians created a fake Facebook page that mimicked his official page and posted an announcement on it that he was supporting Mr. Rajoelina.”</i><br><br> In this example actors created online accounts styled to look like official pages to trick targets into thinking that the presidential candidate announced that they had dropped out of the election (T0097.110: Party Official Persona, T0143.003: Impersonated Persona)" |
|
||||
| [T0129.006 Deny Involvement](../../generated_pages/techniques/T0129.006.md) | "<I>“In the campaign’s final weeks, Pastor Mailhol said, the team of Russians made a request: Drop out of the race and support Mr. Rajoelina. He refused.<br><br> “The Russians made the same proposal to the history professor running for president, saying, “If you accept this deal you will have money” according to Ms. Rasamimanana, the professor’s campaign manager.<br><br> When the professor refused, she said, the Russians created a fake Facebook page that mimicked his official page and posted an announcement on it that he was supporting Mr. Rajoelina.”</i><br><br> In this example actors created online accounts styled to look like official pages to trick targets into thinking that the presidential candidate announced that they had dropped out of the election (T0097.110: Party Official Persona, T0143.003: Impersonated Persona)" |
|
||||
| [T0129.006 Deny Involvement](../../generated_pages/techniques/T0129.006.md) | "<I>“But while Russia’s efforts [at election interference] in the United States fit Moscow’s campaign to upend Western democracy and rattle Mr. Putin’s geopolitical rivals, the undertaking in Madagascar often seemed to have a much simpler objective: profit.<br><br> “Before the election, a Russian company that local officials and foreign diplomats say is controlled by Mr. Prigozhin acquired a major stake in a government-run company that mines chromium, a mineral valued for its use in stainless steel. The acquisition set off protests by workers complaining of unpaid wages, cancelledcanceled benefits and foreign intrusion into a sector that had been a source of national pride for Madagascar.<br><br> “It repeated a pattern in which Russia has swooped into African nations, hoping to reshape their politics for material gain. In the Central African Republic, a former Russian intelligence officer is the top security adviser to the country’s president, while companies linked to Mr. Prigozhin have spread across the nation, snapping up diamonds in both legal and illegal ways, according to government officials, warlords in the diamond trade and registration documents showing Mr. Prigozhin’s growing military and commercial footprint.<br><br> [...] “The [operation switched from supporting the incumbent candidate on realising he would lose the election]. After the Russians pirouetted to help Mr. Rajoelina — their former opponent — win the election, Mr. Prigozhin’s company was able to negotiate with the new government to keep control of the chromium mining operation, despite the worker protests, and Mr. Prigozhin’s political operatives remain stationed in the capital to this day.”</i><br><br> This behaviour matches T0137: Make Money because analysts have asserted that the identified influence operation was in part motivated by a goal to generate profit" |
|
||||
| [T0129.006 Deny Involvement](../../generated_pages/techniques/T0129.006.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary). |
|
||||
| [T0131 Exploit TOS/Content Moderation](../../generated_pages/techniques/T0131.md) | <i>“But while Russia’s efforts [at election interference] in the United States fit Moscow’s campaign to upend Western democracy and rattle Mr. Putin’s geopolitical rivals, the undertaking in Madagascar often seemed to have a much simpler objective: profit.<br><br> “Before the election, a Russian company that local officials and foreign diplomats say is controlled by Mr. Prigozhin acquired a major stake in a government-run company that mines chromium, a mineral valued for its use in stainless steel. The acquisition set off protests by workers complaining of unpaid wages, cancelled benefits and foreign intrusion into a sector that had been a source of national pride for Madagascar.<br><br> “It repeated a pattern in which Russia has swooped into African nations, hoping to reshape their politics for material gain. In the Central African Republic, a former Russian intelligence officer is the top security adviser to the country’s president, while companies linked to Mr. Prigozhin have spread across the nation, snapping up diamonds in both legal and illegal ways, according to government officials, warlords in the diamond trade and registration documents showing Mr. Prigozhin’s growing military and commercial footprint.<br><br> [...] “The [operation switched from supporting the incumbent candidate on realising he would lose the election]. After the Russians pirouetted to help Mr. Rajoelina — their former opponent — win the election, Mr. Prigozhin’s company was able to negotiate with the new government to keep control of the chromium mining operation, despite the worker protests, and Mr. Prigozhin’s political operatives remain stationed in the capital to this day.”</i><br><br> This behaviour matches T0137: Make Money because analysts have asserted that the identified influence operation was in part motivated by a goal to generate profit. |
|
||||
| [T0131 Exploit TOS/Content Moderation](../../generated_pages/techniques/T0131.md) | <i>“In the campaign’s final weeks, Pastor Mailhol said, the team of Russians made a request: Drop out of the race and support Mr. Rajoelina. He refused.<br><br> “The Russians made the same proposal to the history professor running for president, saying, “If you accept this deal you will have money”, according to Ms. Rasamimanana, the professor’s campaign manager.<br><br> “When the professor refused, she said, the Russians created a fake Facebook page that mimicked his official page and posted an announcement on it that he was supporting Mr. Rajoelina.”</i><br><br> In this example, actors created online accounts styled to look like official pages to trick targets into thinking that the presidential candidate had announced they were dropping out of the election (T0097.110: Party Official Persona, T0143.003: Impersonated Persona). |
|
||||
| [T0131 Exploit TOS/Content Moderation](../../generated_pages/techniques/T0131.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary). |
|
||||
| [T0137 Make Money](../../generated_pages/techniques/T0137.md) | <i>“In the campaign’s final weeks, Pastor Mailhol said, the team of Russians made a request: Drop out of the race and support Mr. Rajoelina. He refused.<br><br> “The Russians made the same proposal to the history professor running for president, saying, “If you accept this deal you will have money”, according to Ms. Rasamimanana, the professor’s campaign manager.<br><br> “When the professor refused, she said, the Russians created a fake Facebook page that mimicked his official page and posted an announcement on it that he was supporting Mr. Rajoelina.”</i><br><br> In this example, actors created online accounts styled to look like official pages to trick targets into thinking that the presidential candidate had announced they were dropping out of the election (T0097.110: Party Official Persona, T0143.003: Impersonated Persona). |
|
||||
| [T0137 Make Money](../../generated_pages/techniques/T0137.md) | <i>“But while Russia’s efforts [at election interference] in the United States fit Moscow’s campaign to upend Western democracy and rattle Mr. Putin’s geopolitical rivals, the undertaking in Madagascar often seemed to have a much simpler objective: profit.<br><br> “Before the election, a Russian company that local officials and foreign diplomats say is controlled by Mr. Prigozhin acquired a major stake in a government-run company that mines chromium, a mineral valued for its use in stainless steel. The acquisition set off protests by workers complaining of unpaid wages, cancelled benefits and foreign intrusion into a sector that had been a source of national pride for Madagascar.<br><br> “It repeated a pattern in which Russia has swooped into African nations, hoping to reshape their politics for material gain. In the Central African Republic, a former Russian intelligence officer is the top security adviser to the country’s president, while companies linked to Mr. Prigozhin have spread across the nation, snapping up diamonds in both legal and illegal ways, according to government officials, warlords in the diamond trade and registration documents showing Mr. Prigozhin’s growing military and commercial footprint.<br><br> [...] “The [operation switched from supporting the incumbent candidate on realising he would lose the election]. After the Russians pirouetted to help Mr. Rajoelina — their former opponent — win the election, Mr. Prigozhin’s company was able to negotiate with the new government to keep control of the chromium mining operation, despite the worker protests, and Mr. Prigozhin’s political operatives remain stationed in the capital to this day.”</i><br><br> This behaviour matches T0137: Make Money because analysts have asserted that the identified influence operation was in part motivated by a goal to generate profit. |
|
||||
| [T0137 Make Money](../../generated_pages/techniques/T0137.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary). |
|
||||
| [T0139.002 Silence](../../generated_pages/techniques/T0139.002.md) | <i>“In the campaign’s final weeks, Pastor Mailhol said, the team of Russians made a request: Drop out of the race and support Mr. Rajoelina. He refused.<br><br> “The Russians made the same proposal to the history professor running for president, saying, “If you accept this deal you will have money”, according to Ms. Rasamimanana, the professor’s campaign manager.<br><br> “When the professor refused, she said, the Russians created a fake Facebook page that mimicked his official page and posted an announcement on it that he was supporting Mr. Rajoelina.”</i><br><br> In this example, actors created online accounts styled to look like official pages to trick targets into thinking that the presidential candidate had announced they were dropping out of the election (T0097.110: Party Official Persona, T0143.003: Impersonated Persona). |
|
||||
| [T0139.002 Silence](../../generated_pages/techniques/T0139.002.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary). |
|
||||
| [T0139.002 Silence](../../generated_pages/techniques/T0139.002.md) | <i>“But while Russia’s efforts [at election interference] in the United States fit Moscow’s campaign to upend Western democracy and rattle Mr. Putin’s geopolitical rivals, the undertaking in Madagascar often seemed to have a much simpler objective: profit.<br><br> “Before the election, a Russian company that local officials and foreign diplomats say is controlled by Mr. Prigozhin acquired a major stake in a government-run company that mines chromium, a mineral valued for its use in stainless steel. The acquisition set off protests by workers complaining of unpaid wages, cancelled benefits and foreign intrusion into a sector that had been a source of national pride for Madagascar.<br><br> “It repeated a pattern in which Russia has swooped into African nations, hoping to reshape their politics for material gain. In the Central African Republic, a former Russian intelligence officer is the top security adviser to the country’s president, while companies linked to Mr. Prigozhin have spread across the nation, snapping up diamonds in both legal and illegal ways, according to government officials, warlords in the diamond trade and registration documents showing Mr. Prigozhin’s growing military and commercial footprint.<br><br> [...] “The [operation switched from supporting the incumbent candidate on realising he would lose the election]. After the Russians pirouetted to help Mr. Rajoelina — their former opponent — win the election, Mr. Prigozhin’s company was able to negotiate with the new government to keep control of the chromium mining operation, despite the worker protests, and Mr. Prigozhin’s political operatives remain stationed in the capital to this day.”</i><br><br> This behaviour matches T0137: Make Money because analysts have asserted that the identified influence operation was in part motivated by a goal to generate profit. |
|
||||
| [T0143.001 Authentic Persona](../../generated_pages/techniques/T0143.001.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary). |
|
||||
| [T0143.001 Authentic Persona](../../generated_pages/techniques/T0143.001.md) | <i>“But while Russia’s efforts [at election interference] in the United States fit Moscow’s campaign to upend Western democracy and rattle Mr. Putin’s geopolitical rivals, the undertaking in Madagascar often seemed to have a much simpler objective: profit.<br><br> “Before the election, a Russian company that local officials and foreign diplomats say is controlled by Mr. Prigozhin acquired a major stake in a government-run company that mines chromium, a mineral valued for its use in stainless steel. The acquisition set off protests by workers complaining of unpaid wages, cancelled benefits and foreign intrusion into a sector that had been a source of national pride for Madagascar.<br><br> “It repeated a pattern in which Russia has swooped into African nations, hoping to reshape their politics for material gain. In the Central African Republic, a former Russian intelligence officer is the top security adviser to the country’s president, while companies linked to Mr. Prigozhin have spread across the nation, snapping up diamonds in both legal and illegal ways, according to government officials, warlords in the diamond trade and registration documents showing Mr. Prigozhin’s growing military and commercial footprint.<br><br> [...] “The [operation switched from supporting the incumbent candidate on realising he would lose the election]. After the Russians pirouetted to help Mr. Rajoelina — their former opponent — win the election, Mr. Prigozhin’s company was able to negotiate with the new government to keep control of the chromium mining operation, despite the worker protests, and Mr. Prigozhin’s political operatives remain stationed in the capital to this day.”</i><br><br> This behaviour matches T0137: Make Money because analysts have asserted that the identified influence operation was in part motivated by a goal to generate profit. |
|
||||
| [T0143.001 Authentic Persona](../../generated_pages/techniques/T0143.001.md) | <i>“In the campaign’s final weeks, Pastor Mailhol said, the team of Russians made a request: Drop out of the race and support Mr. Rajoelina. He refused.<br><br> “The Russians made the same proposal to the history professor running for president, saying, “If you accept this deal you will have money”, according to Ms. Rasamimanana, the professor’s campaign manager.<br><br> “When the professor refused, she said, the Russians created a fake Facebook page that mimicked his official page and posted an announcement on it that he was supporting Mr. Rajoelina.”</i><br><br> In this example, actors created online accounts styled to look like official pages to trick targets into thinking that the presidential candidate had announced they were dropping out of the election (T0097.110: Party Official Persona, T0143.003: Impersonated Persona). |
|
||||
| [T0143.003 Impersonated Persona](../../generated_pages/techniques/T0143.003.md) | "<I>“In the campaign’s final weeks, Pastor Mailhol said, the team of Russians made a request: Drop out of the race and support Mr. Rajoelina. He refused.<br><br> “The Russians made the same proposal to the history professor running for president, saying, “If you accept this deal you will have money” according to Ms. Rasamimanana, the professor’s campaign manager.<br><br> When the professor refused, she said, the Russians created a fake Facebook page that mimicked his official page and posted an announcement on it that he was supporting Mr. Rajoelina.”</i><br><br> In this example actors created online accounts styled to look like official pages to trick targets into thinking that the presidential candidate announced that they had dropped out of the election (T0097.110: Party Official Persona, T0143.003: Impersonated Persona)" |
|
||||
| [T0143.003 Impersonated Persona](../../generated_pages/techniques/T0143.003.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary). |
|
||||
| [T0143.003 Impersonated Persona](../../generated_pages/techniques/T0143.003.md) | "<I>“But while Russia’s efforts [at election interference] in the United States fit Moscow’s campaign to upend Western democracy and rattle Mr. Putin’s geopolitical rivals, the undertaking in Madagascar often seemed to have a much simpler objective: profit.<br><br> “Before the election, a Russian company that local officials and foreign diplomats say is controlled by Mr. Prigozhin acquired a major stake in a government-run company that mines chromium, a mineral valued for its use in stainless steel. The acquisition set off protests by workers complaining of unpaid wages, cancelledcanceled benefits and foreign intrusion into a sector that had been a source of national pride for Madagascar.<br><br> “It repeated a pattern in which Russia has swooped into African nations, hoping to reshape their politics for material gain. In the Central African Republic, a former Russian intelligence officer is the top security adviser to the country’s president, while companies linked to Mr. Prigozhin have spread across the nation, snapping up diamonds in both legal and illegal ways, according to government officials, warlords in the diamond trade and registration documents showing Mr. Prigozhin’s growing military and commercial footprint.<br><br> [...] “The [operation switched from supporting the incumbent candidate on realising he would lose the election]. After the Russians pirouetted to help Mr. Rajoelina — their former opponent — win the election, Mr. Prigozhin’s company was able to negotiate with the new government to keep control of the chromium mining operation, despite the worker protests, and Mr. Prigozhin’s political operatives remain stationed in the capital to this day.”</i><br><br> This behaviour matches T0137: Make Money because analysts have asserted that the identified influence operation was in part motivated by a goal to generate profit" |
|
||||
| [T0143.003 Impersonated Persona](../../generated_pages/techniques/T0143.003.md) | "<I>“In the campaign’s final weeks, Pastor Mailhol said, the team of Russians made a request: Drop out of the race and support Mr. Rajoelina. He refused.<br><br> “The Russians made the same proposal to the history professor running for president, saying, “If you accept this deal you will have money” according to Ms. Rasamimanana, the professor’s campaign manager.<br><br> When the professor refused, she said, the Russians created a fake Facebook page that mimicked his official page and posted an announcement on it that he was supporting Mr. Rajoelina.”</i><br><br> In this example actors created online accounts styled to look like official pages to trick targets into thinking that the presidential candidate announced that they had dropped out of the election (T0097.110: Party Official Persona, T0143.003: Impersonated Persona)" |
|
||||
| [T0143.003 Impersonated Persona](../../generated_pages/techniques/T0143.003.md) | "<I>“In the campaign’s final weeks, Pastor Mailhol said, the team of Russians made a request: Drop out of the race and support Mr. Rajoelina. He refused.<br><br> “The Russians made the same proposal to the history professor running for president, saying, “If you accept this deal you will have money” according to Ms. Rasamimanana, the professor’s campaign manager.<br><br> When the professor refused, she said, the Russians created a fake Facebook page that mimicked his official page and posted an announcement on it that he was supporting Mr. Rajoelina.”</i><br><br> In this example actors created online accounts styled to look like official pages to trick targets into thinking that the presidential candidate announced that they had dropped out of the election (T0097.110: Party Official Persona, T0143.003: Impersonated Persona)" |
|
||||
| [T0143.003 Impersonated Persona](../../generated_pages/techniques/T0143.003.md) | "<I>“In the campaign’s final weeks, Pastor Mailhol said, the team of Russians made a request: Drop out of the race and support Mr. Rajoelina. He refused.<br><br> “The Russians made the same proposal to the history professor running for president, saying, “If you accept this deal you will have money” according to Ms. Rasamimanana, the professor’s campaign manager.<br><br> When the professor refused, she said, the Russians created a fake Facebook page that mimicked his official page and posted an announcement on it that he was supporting Mr. Rajoelina.”</i><br><br> In this example actors created online accounts styled to look like official pages to trick targets into thinking that the presidential candidate announced that they had dropped out of the election (T0097.110: Party Official Persona, T0143.003: Impersonated Persona)" |
|
||||
| [T0143.003 Impersonated Persona](../../generated_pages/techniques/T0143.003.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary). |
|
||||
| [T0143.003 Impersonated Persona](../../generated_pages/techniques/T0143.003.md) | "<I>“But while Russia’s efforts [at election interference] in the United States fit Moscow’s campaign to upend Western democracy and rattle Mr. Putin’s geopolitical rivals, the undertaking in Madagascar often seemed to have a much simpler objective: profit.<br><br> “Before the election, a Russian company that local officials and foreign diplomats say is controlled by Mr. Prigozhin acquired a major stake in a government-run company that mines chromium, a mineral valued for its use in stainless steel. The acquisition set off protests by workers complaining of unpaid wages, cancelledcanceled benefits and foreign intrusion into a sector that had been a source of national pride for Madagascar.<br><br> “It repeated a pattern in which Russia has swooped into African nations, hoping to reshape their politics for material gain. In the Central African Republic, a former Russian intelligence officer is the top security adviser to the country’s president, while companies linked to Mr. Prigozhin have spread across the nation, snapping up diamonds in both legal and illegal ways, according to government officials, warlords in the diamond trade and registration documents showing Mr. Prigozhin’s growing military and commercial footprint.<br><br> [...] “The [operation switched from supporting the incumbent candidate on realising he would lose the election]. After the Russians pirouetted to help Mr. Rajoelina — their former opponent — win the election, Mr. Prigozhin’s company was able to negotiate with the new government to keep control of the chromium mining operation, despite the worker protests, and Mr. Prigozhin’s political operatives remain stationed in the capital to this day.”</i><br><br> This behaviour matches T0137: Make Money because analysts have asserted that the identified influence operation was in part motivated by a goal to generate profit" |
|
||||
| [T0145.001 Copy Account Imagery](../../generated_pages/techniques/T0145.001.md) | "<I>“In the campaign’s final weeks, Pastor Mailhol said, the team of Russians made a request: Drop out of the race and support Mr. Rajoelina. He refused.<br><br> “The Russians made the same proposal to the history professor running for president, saying, “If you accept this deal you will have money” according to Ms. Rasamimanana, the professor’s campaign manager.<br><br> When the professor refused, she said, the Russians created a fake Facebook page that mimicked his official page and posted an announcement on it that he was supporting Mr. Rajoelina.”</i><br><br> In this example actors created online accounts styled to look like official pages to trick targets into thinking that the presidential candidate announced that they had dropped out of the election (T0097.110: Party Official Persona, T0143.003: Impersonated Persona)" |
|
||||
| [T0145.001 Copy Account Imagery](../../generated_pages/techniques/T0145.001.md) | "<I>“In the campaign’s final weeks, Pastor Mailhol said, the team of Russians made a request: Drop out of the race and support Mr. Rajoelina. He refused.<br><br> “The Russians made the same proposal to the history professor running for president, saying, “If you accept this deal you will have money” according to Ms. Rasamimanana, the professor’s campaign manager.<br><br> When the professor refused, she said, the Russians created a fake Facebook page that mimicked his official page and posted an announcement on it that he was supporting Mr. Rajoelina.”</i><br><br> In this example actors created online accounts styled to look like official pages to trick targets into thinking that the presidential candidate announced that they had dropped out of the election (T0097.110: Party Official Persona, T0143.003: Impersonated Persona)" |
|
||||
| [T0145.001 Copy Account Imagery](../../generated_pages/techniques/T0145.001.md) | "<I>“But while Russia’s efforts [at election interference] in the United States fit Moscow’s campaign to upend Western democracy and rattle Mr. Putin’s geopolitical rivals, the undertaking in Madagascar often seemed to have a much simpler objective: profit.<br><br> “Before the election, a Russian company that local officials and foreign diplomats say is controlled by Mr. Prigozhin acquired a major stake in a government-run company that mines chromium, a mineral valued for its use in stainless steel. The acquisition set off protests by workers complaining of unpaid wages, cancelledcanceled benefits and foreign intrusion into a sector that had been a source of national pride for Madagascar.<br><br> “It repeated a pattern in which Russia has swooped into African nations, hoping to reshape their politics for material gain. In the Central African Republic, a former Russian intelligence officer is the top security adviser to the country’s president, while companies linked to Mr. Prigozhin have spread across the nation, snapping up diamonds in both legal and illegal ways, according to government officials, warlords in the diamond trade and registration documents showing Mr. Prigozhin’s growing military and commercial footprint.<br><br> [...] “The [operation switched from supporting the incumbent candidate on realising he would lose the election]. After the Russians pirouetted to help Mr. Rajoelina — their former opponent — win the election, Mr. Prigozhin’s company was able to negotiate with the new government to keep control of the chromium mining operation, despite the worker protests, and Mr. Prigozhin’s political operatives remain stationed in the capital to this day.”</i><br><br> This behaviour matches T0137: Make Money because analysts have asserted that the identified influence operation was in part motivated by a goal to generate profit" |
|
||||
| [T0145.001 Copy Account Imagery](../../generated_pages/techniques/T0145.001.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary). |
|
||||
| [T0097.110 Party Official Persona](../../generated_pages/techniques/T0097.110.md) | IT00000328 <I>“In the campaign’s final weeks, Pastor Mailhol said, the team of Russians made a request: Drop out of the race and support Mr. Rajoelina. He refused.<br><br> “The Russians made the same proposal to the history professor running for president, saying, “If you accept this deal you will have money” according to Ms. Rasamimanana, the professor’s campaign manager.<br><br> When the professor refused, she said, the Russians created a fake Facebook page that mimicked his official page and posted an announcement on it that he was supporting Mr. Rajoelina.”</i><br><br> In this example actors created online accounts styled to look like official pages to trick targets into thinking that the presidential candidate announced that they had dropped out of the election (T0097.110: Party Official Persona, T0143.003: Impersonated Persona) |
| [T0129.006 Deny Involvement](../../generated_pages/techniques/T0129.006.md) | IT00000331 <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary). |
| [T0137 Make Money](../../generated_pages/techniques/T0137.md) | IT00000330 <I>“But while Russia’s efforts [at election interference] in the United States fit Moscow’s campaign to upend Western democracy and rattle Mr. Putin’s geopolitical rivals, the undertaking in Madagascar often seemed to have a much simpler objective: profit.<br><br> “Before the election, a Russian company that local officials and foreign diplomats say is controlled by Mr. Prigozhin acquired a major stake in a government-run company that mines chromium, a mineral valued for its use in stainless steel. The acquisition set off protests by workers complaining of unpaid wages, canceled benefits and foreign intrusion into a sector that had been a source of national pride for Madagascar.<br><br> “It repeated a pattern in which Russia has swooped into African nations, hoping to reshape their politics for material gain. In the Central African Republic, a former Russian intelligence officer is the top security adviser to the country’s president, while companies linked to Mr. Prigozhin have spread across the nation, snapping up diamonds in both legal and illegal ways, according to government officials, warlords in the diamond trade and registration documents showing Mr. Prigozhin’s growing military and commercial footprint.<br><br> [...] “The [operation switched from supporting the incumbent candidate on realising he would lose the election]. After the Russians pirouetted to help Mr. Rajoelina — their former opponent — win the election, Mr. Prigozhin’s company was able to negotiate with the new government to keep control of the chromium mining operation, despite the worker protests, and Mr. Prigozhin’s political operatives remain stationed in the capital to this day.”</i><br><br> This behaviour matches T0137: Make Money because analysts have asserted that the identified influence operation was in part motivated by a goal to generate profit. |
| [T0143.003 Impersonated Persona](../../generated_pages/techniques/T0143.003.md) | IT00000329 <I>“In the campaign’s final weeks, Pastor Mailhol said, the team of Russians made a request: Drop out of the race and support Mr. Rajoelina. He refused.<br><br> “The Russians made the same proposal to the history professor running for president, saying, “If you accept this deal you will have money” according to Ms. Rasamimanana, the professor’s campaign manager.<br><br> When the professor refused, she said, the Russians created a fake Facebook page that mimicked his official page and posted an announcement on it that he was supporting Mr. Rajoelina.”</i><br><br> In this example actors created online accounts styled to look like official pages to trick targets into thinking that the presidential candidate announced that they had dropped out of the election (T0097.110: Party Official Persona, T0143.003: Impersonated Persona) |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
| Technique | Description given for this incident |
| --------- | ------------------------- |
| [T0084.002 Plagiarise Content](../../generated_pages/techniques/T0084.002.md) | <i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</I><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying their previous posts (T0084.002: Plagiarise Content). |
| [T0085.004 Develop Document](../../generated_pages/techniques/T0085.004.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<br><br> <i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</i><br><br> In this example, actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying their previous posts (T0084.002: Plagiarise Content). |
| [T0084.002 Plagiarise Content](../../generated_pages/techniques/T0084.002.md) | IT00000335 <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<br><br> <i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</i><br><br> In this example, actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying their previous posts (T0084.002: Plagiarise Content). |
| [T0097.101 Local Persona](../../generated_pages/techniques/T0097.101.md) | IT00000252 <i>“In addition to directly posting material on social media, we observed some personas in the network [of inauthentic accounts attributed to Iran] leverage legitimate print and online media outlets in the U.S. and Israel to promote Iranian interests via the submission of letters, guest columns, and blog posts that were then published. We also identified personas that we suspect were fabricated for the sole purpose of submitting such letters, but that do not appear to maintain accounts on social media. The personas claimed to be based in varying locations depending on the news outlets they were targeting for submission; for example, a persona that listed their location as Seattle, WA in a letter submitted to the Seattle Times subsequently claimed to be located in Baytown, TX in a letter submitted to The Baytown Sun. Other accounts in the network then posted links to some of these letters on social media.”</i><br><br> In this example, actors fabricated personas of individuals presented as living in the areas targeted for influence, using letters to local papers (T0097.101: Local Persona, T0143.002: Fabricated Persona). |
| [T0097.102 Journalist Persona](../../generated_pages/techniques/T0097.102.md) | IT00000257 <i>“Accounts in the network [of inauthentic accounts attributed to Iran], under the guise of journalist personas, also solicited various individuals over Twitter for interviews and chats, including real journalists and politicians. The personas appear to have successfully conducted remote video and audio interviews with U.S. and UK-based individuals, including a prominent activist, a radio talk show host, and a former U.S. Government official, and subsequently posted the interviews on social media, showing only the individual being interviewed and not the interviewer. The interviewees expressed views that Iran would likely find favorable, discussing topics such as the February 2019 Warsaw summit, an attack on a military parade in the Iranian city of Ahvaz, and the killing of Jamal Khashoggi.<br><br> “The provenance of these interviews appear to have been misrepresented on at least one occasion, with one persona appearing to have falsely claimed to be operating on behalf of a mainstream news outlet; a remote video interview with a US-based activist about the Jamal Khashoggi killing was posted by an account adopting the persona of a journalist from the outlet Newsday, with the Newsday logo also appearing in the video. We did not identify any Newsday interview with the activist in question on this topic. In another instance, a persona posing as a journalist directed tweets containing audio of an interview conducted with a former U.S. Government official at real media personalities, calling on them to post about the interview.”</i><br><br> In this example, actors fabricated journalist personas (T0097.102: Journalist Persona, T0143.002: Fabricated Persona) presented as working at existing news outlets (T0097.202: News Outlet Persona, T0143.003: Impersonated Persona) in order to conduct interviews with targeted individuals. |
| [T0097.110 Party Official Persona](../../generated_pages/techniques/T0097.110.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<br><br> <i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</i><br><br> In this example, actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying their previous posts (T0084.002: Plagiarise Content). |
|
||||
| [T0097.110 Party Official Persona](../../generated_pages/techniques/T0097.110.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. 
political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</I><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying its previous posts (T0084.002: Plagiarise Content). |
|
||||
| [T0097.110 Party Official Persona](../../generated_pages/techniques/T0097.110.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. 
political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</I><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying its previous posts (T0084.002: Plagiarise Content). |
|
||||
| [T0097.110 Party Official Persona](../../generated_pages/techniques/T0097.110.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. 
political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</I><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying its previous posts (T0084.002: Plagiarise Content). |
|
||||
| [T0097.110 Party Official Persona](../../generated_pages/techniques/T0097.110.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. 
political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</I><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying its previous posts (T0084.002: Plagiarise Content). |
| [T0097.111 Government Official Persona](../../generated_pages/techniques/T0097.111.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<br><br> <i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</i><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying their previous posts (T0084.002: Plagiarise Content). |
|
||||
| [T0097.111 Government Official Persona](../../generated_pages/techniques/T0097.111.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. 
political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</I><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying its previous posts (T0084.002: Plagiarise Content). |
|
||||
| [T0097.111 Government Official Persona](../../generated_pages/techniques/T0097.111.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. 
political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</I><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying its previous posts (T0084.002: Plagiarise Content). |
|
||||
| [T0097.111 Government Official Persona](../../generated_pages/techniques/T0097.111.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. 
political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</I><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying its previous posts (T0084.002: Plagiarise Content). |
|
||||
| [T0097.111 Government Official Persona](../../generated_pages/techniques/T0097.111.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. 
political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</I><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying its previous posts (T0084.002: Plagiarise Content). |
|
||||
| [T0097.111 Government Official Persona](../../generated_pages/techniques/T0097.111.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. 
political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</I><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying its previous posts (T0084.002: Plagiarise Content). |
|
||||
| [T0097.111 Government Official Persona](../../generated_pages/techniques/T0097.111.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. 
political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</I><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying its previous posts (T0084.002: Plagiarise Content). |
|
||||
| [T0097.111 Government Official Persona](../../generated_pages/techniques/T0097.111.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. 
political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</I><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying its previous posts (T0084.002: Plagiarise Content). |
| [T0097.110 Party Official Persona](../../generated_pages/techniques/T0097.110.md) | IT00000334 <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<br><br> <i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</i><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery) and copying their previous posts (T0084.002: Plagiarise Content). |
| [T0097.202 News Outlet Persona](../../generated_pages/techniques/T0097.202.md) | IT00000255 <i>“Accounts in the network [of inauthentic accounts attributed to Iran], under the guise of journalist personas, also solicited various individuals over Twitter for interviews and chats, including real journalists and politicians. The personas appear to have successfully conducted remote video and audio interviews with U.S. and UK-based individuals, including a prominent activist, a radio talk show host, and a former U.S. Government official, and subsequently posted the interviews on social media, showing only the individual being interviewed and not the interviewer. The interviewees expressed views that Iran would likely find favorable, discussing topics such as the February 2019 Warsaw summit, an attack on a military parade in the Iranian city of Ahvaz, and the killing of Jamal Khashoggi.<br><br> “The provenance of these interviews appear to have been misrepresented on at least one occasion, with one persona appearing to have falsely claimed to be operating on behalf of a mainstream news outlet; a remote video interview with a US-based activist about the Jamal Khashoggi killing was posted by an account adopting the persona of a journalist from the outlet Newsday, with the Newsday logo also appearing in the video. We did not identify any Newsday interview with the activist in question on this topic. In another instance, a persona posing as a journalist directed tweets containing audio of an interview conducted with a former U.S. Government official at real media personalities, calling on them to post about the interview.”</i><br><br> In this example actors fabricated journalists (T0097.102: Journalist Persona, T0143.002: Fabricated Persona) who worked at existing news outlets (T0097.202: News Outlet Persona, T0143.003: Impersonated Persona) in order to conduct interviews with targeted individuals. |
| [T0097.206 Government Institution Persona](../../generated_pages/techniques/T0097.206.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<br><br> <i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</i><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery) and copying their previous posts (T0084.002: Plagiarise Content). |
|
||||
| [T0097.206 Government Institution Persona](../../generated_pages/techniques/T0097.206.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<br><br><i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</i><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying their previous posts (T0084.002: Plagiarise Content). |
| [T0129.006 Deny Involvement](../../generated_pages/techniques/T0129.006.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<br><br><i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</i><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying their previous posts (T0084.002: Plagiarise Content). |
| [T0131 Exploit TOS/Content Moderation](../../generated_pages/techniques/T0131.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<br><br><i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</i><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying their previous posts (T0084.002: Plagiarise Content). |
|
||||
| [T0137 Make Money](../../generated_pages/techniques/T0137.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<br><br> <i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</i><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying their previous posts (T0084.002: Plagiarise Content). |
|
||||
| [T0139.002 Silence](../../generated_pages/techniques/T0139.002.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<br><br> <i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</i><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying their previous posts (T0084.002: Plagiarise Content). |
|
||||
| [T0143.001 Authentic Persona](../../generated_pages/techniques/T0143.001.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<br><br> <i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</i><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying their previous posts (T0084.002: Plagiarise Content). |
|
||||
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000256 <i>“Accounts in the network [of inauthentic accounts attributed to Iran], under the guise of journalist personas, also solicited various individuals over Twitter for interviews and chats, including real journalists and politicians. The personas appear to have successfully conducted remote video and audio interviews with U.S. and UK-based individuals, including a prominent activist, a radio talk show host, and a former U.S. Government official, and subsequently posted the interviews on social media, showing only the individual being interviewed and not the interviewer. The interviewees expressed views that Iran would likely find favorable, discussing topics such as the February 2019 Warsaw summit, an attack on a military parade in the Iranian city of Ahvaz, and the killing of Jamal Khashoggi.<br><br> “The provenance of these interviews appear to have been misrepresented on at least one occasion, with one persona appearing to have falsely claimed to be operating on behalf of a mainstream news outlet; a remote video interview with a US-based activist about the Jamal Khashoggi killing was posted by an account adopting the persona of a journalist from the outlet Newsday, with the Newsday logo also appearing in the video. We did not identify any Newsday interview with the activist in question on this topic. In another instance, a persona posing as a journalist directed tweets containing audio of an interview conducted with a former U.S. Government official at real media personalities, calling on them to post about the interview.”</i><br><br> In this example actors fabricated journalists (T0097.102: Journalist Persona, T0143.002: Fabricated Persona) who worked at existing news outlets (T0097.202: News Outlet Persona, T0143.003: Impersonated Persona) in order to conduct interviews with targeted individuals. |
|
||||
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000253 <i>“In addition to directly posting material on social media, we observed some personas in the network [of inauthentic accounts attributed to Iran] leverage legitimate print and online media outlets in the U.S. and Israel to promote Iranian interests via the submission of letters, guest columns, and blog posts that were then published. We also identified personas that we suspect were fabricated for the sole purpose of submitting such letters, but that do not appear to maintain accounts on social media. The personas claimed to be based in varying locations depending on the news outlets they were targeting for submission; for example, a persona that listed their location as Seattle, WA in a letter submitted to the Seattle Times subsequently claimed to be located in Baytown, TX in a letter submitted to The Baytown Sun. Other accounts in the network then posted links to some of these letters on social media.”</i><br><br> In this example actors fabricated individuals who lived in areas which were being targeted for influence through the use of letters to local papers (T0097.101: Local Persona, T0143.002: Fabricated Persona). |
|
||||
| [T0143.003 Impersonated Persona](../../generated_pages/techniques/T0143.003.md) | IT00000254 <i>“Accounts in the network [of inauthentic accounts attributed to Iran], under the guise of journalist personas, also solicited various individuals over Twitter for interviews and chats, including real journalists and politicians. The personas appear to have successfully conducted remote video and audio interviews with U.S. and UK-based individuals, including a prominent activist, a radio talk show host, and a former U.S. Government official, and subsequently posted the interviews on social media, showing only the individual being interviewed and not the interviewer. The interviewees expressed views that Iran would likely find favorable, discussing topics such as the February 2019 Warsaw summit, an attack on a military parade in the Iranian city of Ahvaz, and the killing of Jamal Khashoggi.<br><br> “The provenance of these interviews appear to have been misrepresented on at least one occasion, with one persona appearing to have falsely claimed to be operating on behalf of a mainstream news outlet; a remote video interview with a US-based activist about the Jamal Khashoggi killing was posted by an account adopting the persona of a journalist from the outlet Newsday, with the Newsday logo also appearing in the video. We did not identify any Newsday interview with the activist in question on this topic. In another instance, a persona posing as a journalist directed tweets containing audio of an interview conducted with a former U.S. Government official at real media personalities, calling on them to post about the interview.”</i><br><br> In this example actors fabricated journalists (T0097.102: Journalist Persona, T0143.002: Fabricated Persona) who worked at existing news outlets (T0097.202: News Outlet Persona, T0143.003: Impersonated Persona) in order to conduct interviews with targeted individuals. |
|
||||
| [T0143.003 Impersonated Persona](../../generated_pages/techniques/T0143.003.md) | <i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</i><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery) and copying their previous posts (T0084.002: Plagiarise Content). |
|
||||
| [T0145.001 Copy Account Imagery](../../generated_pages/techniques/T0145.001.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. 
political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</I><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying its previous posts (T0084.002: Plagiarise Content). |
|
||||
| [T0145.001 Copy Account Imagery](../../generated_pages/techniques/T0145.001.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. 
political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</I><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying its previous posts (T0084.002: Plagiarise Content). |
|
||||
| [T0145.001 Copy Account Imagery](../../generated_pages/techniques/T0145.001.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. 
political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</I><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying its previous posts (T0084.002: Plagiarise Content). |
|
||||
| [T0145.001 Copy Account Imagery](../../generated_pages/techniques/T0145.001.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. 
political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</I><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying its previous posts (T0084.002: Plagiarise Content). |
|
||||
| [T0143.003 Impersonated Persona](../../generated_pages/techniques/T0143.003.md) | IT00000333 <i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</i><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery) and copying their previous posts (T0084.002: Plagiarise Content). |
|
||||
| [T0145.001 Copy Account Imagery](../../generated_pages/techniques/T0145.001.md) | IT00000332 <i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</i><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery) and copying their previous posts (T0084.002: Plagiarise Content). |
|
||||
|
||||
|
||||
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
|
@ -22,11 +22,8 @@
|
||||
| Technique | Description given for this incident |
|
||||
| --------- | ------------------------- |
|
||||
| [T0097.103 Activist Persona](../../generated_pages/techniques/T0097.103.md) | IT00000262 <i>“The Syria portion of the network [of inauthentic accounts attributed to Russia] included additional sockpuppet accounts. One of these claimed to be a gay rights defender in Syria. Several said they were Syrian journalists. Another account, @SophiaHammer3, said she was born in Syria but currently lives in London. “I’m fond of history and politics. I struggle for justice.” Twitter users had previously observed that Sophia was likely a sockpuppet.”</i><br><br> This behaviour matches T0097.103: Activist Persona because the account presents itself as defending a political cause, in this case gay rights.<br><br> Twitter’s technical indicators allowed their analysts to assert that these accounts were “reliably tied to Russian state actors”, meaning the presented personas were entirely fabricated (T0143.002: Fabricated Persona); these accounts are not legitimate gay rights defenders or journalists, but assets controlled by Russia publishing narratives beneficial to their agenda. |
|
||||
| [T0097.202 News Outlet Persona](../../generated_pages/techniques/T0097.202.md) | IT00000259 <i>“Approximately one-third of the suspended accounts [in the network of inauthentic accounts attributed to Russia] tweeted primarily about Syria, in English, Russian, and Arabic; many accounts tweeted in all three languages. The themes these accounts pushed will be familiar to anyone who has studied Russian overt or covert information operations about Syria: <br> <br>- Praising Russia’s role in Syria; claiming Russia was killing terrorists in Syria and highlighting Russia’s humanitarian aid <br>- Criticizing the role of the Turkey and the US in Syria; claiming the US killed civilians in Syria <br>- Criticizing the White Helmets, and claiming that they worked with Westerners to created scenes to make it look like the Syrian government used chemical weapons <br><br> “The two most prominent Syria accounts were @Syria_FreeNews and @PamSpenser. <br><br> “@Syria_FreeNews had 20,505 followers and was created on April 6, 2017. The account’s bio said “Exclusive information about Middle East and Northern Africa countries events. BreaKing news from the scene.””</i><br><br> This behaviour matches T0097.202: News Outlet Persona because the account @Syria_FreeNews presented itself as a news outlet in its name, bio, and branding, across all websites on which the persona had been established (T0144.001: Persona Presented across Platforms). Twitter’s technical indicators allowed them to assert that the account “can be reliably tied to Russian state actors”. Because of this we can assert that the persona is entirely fabricated (T0143.002: Fabricated Persona); this is not a legitimate news outlet providing information about Syria, but an asset controlled by Russia publishing narratives beneficial to their agenda. |
|
||||
| [T0097.202 News Outlet Persona](../../generated_pages/techniques/T0097.202.md) | IT00000265 <i>“Two accounts [in the second network of accounts taken down by Twitter] appear to have been operated by Oriental Review and the Strategic Culture Foundation, respectively. Oriental Review bills itself as an “open source site for free thinking”, though it trades in outlandish conspiracy theories and posts content bylined by fake people. Stanford Internet Observatory researchers and investigative journalists have previously noted the presence of content bylined by fake “reporter” personas tied to the GRU-linked front Inside Syria Media Center, posted on Oriental Review.”</i><br><br> In an effort to make the Oriental Review’s stories appear more credible, the threat actors created fake journalists and pretended they wrote the articles on their website (aka “bylined” them).<br><br> In DISARM terms, they fabricated journalists (T0143.002: Fabricated Persona, T0097.003: Journalist Persona), and then used these fabricated journalists to increase perceived legitimacy (T0097.202: News Outlet Persona, T0143.002: Fabricated Persona). |
|
||||
| [T0143.001 Authentic Persona](../../generated_pages/techniques/T0143.001.md) | IT00000258 <i>“The largest account [in the network of inauthentic accounts attributed to Russia] had 11,542 followers but only 8 had over 1,000 followers, and 11 had under ten. The accounts in aggregate had only 79,807 engagements across the entire tweet corpus, and appear to have been linked to the operations primarily via technical indicators rather than amplification or conversation between them. A few of the bios from accounts in the set claim to be journalists. Two profiles, belonging to an American activist and a Russian academic, were definitively real people; we do not have sufficient visibility into the technical indicators that led to their inclusion in the network and thus do not include them in our discussion.”</i><br><br> In this example the Stanford Internet Observatory has been provided data on two networks which, according to Twitter, showed signs of being affiliated with Russia’s Internet Research Agency (IRA). Two accounts investigated by Stanford were real people presenting their authentic personas, matching T0143.001: Authentic Persona.<br><br> Stanford didn’t have access to the technical indicators associating these accounts with the IRA, so they did not include data associated with these accounts for assessment. Analysts with access to platform logs may be able to uncover indicators of suspicious behaviour in accounts presenting authentic personas, using attribution methods unavailable to analysts working with open source data. |
|
||||
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000261 <i>“Approximately one-third of the suspended accounts [in the network of inauthentic accounts attributed to Russia] tweeted primarily about Syria, in English, Russian, and Arabic; many accounts tweeted in all three languages. The themes these accounts pushed will be familiar to anyone who has studied Russian overt or covert information operations about Syria: <br> <br>- Praising Russia’s role in Syria; claiming Russia was killing terrorists in Syria and highlighting Russia’s humanitarian aid <br>- Criticizing the role of the Turkey and the US in Syria; claiming the US killed civilians in Syria <br>- Criticizing the White Helmets, and claiming that they worked with Westerners to created scenes to make it look like the Syrian government used chemical weapons <br><br> “The two most prominent Syria accounts were @Syria_FreeNews and @PamSpenser. <br><br> “@Syria_FreeNews had 20,505 followers and was created on April 6, 2017. The account’s bio said “Exclusive information about Middle East and Northern Africa countries events. BreaKing news from the scene.””</i><br><br> This behaviour matches T0097.202: News Outlet Persona because the account @Syria_FreeNews presented itself as a news outlet in its name, bio, and branding, across all websites on which the persona had been established (T0144.001: Persona Presented across Platforms). Twitter’s technical indicators allowed them to assert that the account “can be reliably tied to Russian state actors”. Because of this we can assert that the persona is entirely fabricated (T0143.002: Fabricated Persona); this is not a legitimate news outlet providing information about Syria, but an asset controlled by Russia publishing narratives beneficial to their agenda. |
|
||||
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000263 <i>“The Syria portion of the network [of inauthentic accounts attributed to Russia] included additional sockpuppet accounts. One of these claimed to be a gay rights defender in Syria. Several said they were Syrian journalists. Another account, @SophiaHammer3, said she was born in Syria but currently lives in London. “I’m fond of history and politics. I struggle for justice.” Twitter users had previously observed that Sophia was likely a sockpuppet.”</i><br><br> This behaviour matches T0097.103: Activist Persona because the account presents itself as defending a political cause, in this case gay rights.<br><br> Twitter’s technical indicators allowed their analysts to assert that these accounts were “reliably tied to Russian state actors”, meaning the presented personas were entirely fabricated (T0143.002: Fabricated Persona); these accounts are not legitimate gay rights defenders or journalists, but assets controlled by Russia publishing narratives beneficial to their agenda. |
|
||||
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000264 <i>“Two accounts [in the second network of accounts taken down by Twitter] appear to have been operated by Oriental Review and the Strategic Culture Foundation, respectively. Oriental Review bills itself as an “open source site for free thinking”, though it trades in outlandish conspiracy theories and posts content bylined by fake people. Stanford Internet Observatory researchers and investigative journalists have previously noted the presence of content bylined by fake “reporter” personas tied to the GRU-linked front Inside Syria Media Center, posted on Oriental Review.”</i><br><br> In an effort to make the Oriental Review’s stories appear more credible, the threat actors created fake journalists and pretended they wrote the articles on their website (aka “bylined” them).<br><br> In DISARM terms, they fabricated journalists (T0143.002: Fabricated Persona, T0097.003: Journalist Persona), and then used these fabricated journalists to increase perceived legitimacy (T0097.202: News Outlet Persona, T0143.002: Fabricated Persona). |
|
||||
| [T0144.001 Present Persona across Platforms](../../generated_pages/techniques/T0144.001.md) | IT00000260 <i>“Approximately one-third of the suspended accounts [in the network of inauthentic accounts attributed to Russia] tweeted primarily about Syria, in English, Russian, and Arabic; many accounts tweeted in all three languages. The themes these accounts pushed will be familiar to anyone who has studied Russian overt or covert information operations about Syria: <br> <br>- Praising Russia’s role in Syria; claiming Russia was killing terrorists in Syria and highlighting Russia’s humanitarian aid <br>- Criticizing the role of the Turkey and the US in Syria; claiming the US killed civilians in Syria <br>- Criticizing the White Helmets, and claiming that they worked with Westerners to created scenes to make it look like the Syrian government used chemical weapons <br><br> “The two most prominent Syria accounts were @Syria_FreeNews and @PamSpenser. <br><br> “@Syria_FreeNews had 20,505 followers and was created on April 6, 2017. The account’s bio said “Exclusive information about Middle East and Northern Africa countries events. BreaKing news from the scene.””</i><br><br> This behaviour matches T0097.202: News Outlet Persona because the account @Syria_FreeNews presented itself as a news outlet in its name, bio, and branding, across all websites on which the persona had been established (T0144.001: Persona Presented across Platforms). Twitter’s technical indicators allowed them to assert that the account “can be reliably tied to Russian state actors”. Because of this we can assert that the persona is entirely fabricated (T0143.002: Fabricated Persona); this is not a legitimate news outlet providing information about Syria, but an asset controlled by Russia publishing narratives beneficial to their agenda. |
|
||||
|
||||
|
@ -25,6 +25,7 @@
|
||||
| [T0097.106 Recruiter Persona](../../generated_pages/techniques/T0097.106.md) | IT00000270 <i>“[Meta has] removed one Page, five Facebook accounts, one Group and three Instagram accounts for foreign or government interference which is coordinated inauthentic behavior on behalf of a foreign or government entity. This small network originated in Russia and focused primarily on Turkey and Europe, and also on the United States.<br><br> “This operation relied on fake accounts — some of which had been already detected and removed by our automated systems — to manage their Page and their Group, and to drive people to their site purporting to be an independent think-tank based primarily in Turkey. These accounts posed as locals based in Turkey, Canada and the US. They also recruited people to write for their website. This network had almost no following on our platforms when we removed it.”</i><br><br> Meta identified that a network of accounts originating in Russia was driving people off platform to a site which presented itself as a think-tank (T0097.204: Think Tank Persona). Meta did not make an attribution about the authenticity of this off-site think tank, so neither T0143.001: Authentic Persona nor T0143.002: Fabricated Persona is used here.<br><br> Meta had access to technical data for accounts on its platform, and asserted that they were fabricated individuals posing as locals who recruited targets to write content for their website (T0097.101: Local Persona, T0097.106: Recruiter Persona, T0143.002: Fabricated Persona). |
| [T0097.204 Think Tank Persona](../../generated_pages/techniques/T0097.204.md) | IT00000267 <i>“[Meta has] removed one Page, five Facebook accounts, one Group and three Instagram accounts for foreign or government interference which is coordinated inauthentic behavior on behalf of a foreign or government entity. This small network originated in Russia and focused primarily on Turkey and Europe, and also on the United States.<br><br> “This operation relied on fake accounts — some of which had been already detected and removed by our automated systems — to manage their Page and their Group, and to drive people to their site purporting to be an independent think-tank based primarily in Turkey. These accounts posed as locals based in Turkey, Canada and the US. They also recruited people to write for their website. This network had almost no following on our platforms when we removed it.”</i><br><br> Meta identified that a network of accounts originating in Russia were driving people off platform to a site which presented itself as a think-tank (T0097.204: Think Tank Persona). Meta did not make an attribution about the authenticity of this off-site think tank, so neither T0143.001: Authentic Persona or T0143.002: Fabricated Persona are used here.<br><br> Meta had access to technical data for accounts on its platform, and asserted that they were fabricated individuals posing as locals who recruited targets to write content for their website (T0097.101: Local Persona, T0097.106: Recruiter Persona, T0143.002: Fabricated Persona). |
| [T0143.001 Authentic Persona](../../generated_pages/techniques/T0143.001.md) | IT00000269 <i>“[Meta has] removed one Page, five Facebook accounts, one Group and three Instagram accounts for foreign or government interference which is coordinated inauthentic behavior on behalf of a foreign or government entity. This small network originated in Russia and focused primarily on Turkey and Europe, and also on the United States.<br><br> “This operation relied on fake accounts — some of which had been already detected and removed by our automated systems — to manage their Page and their Group, and to drive people to their site purporting to be an independent think-tank based primarily in Turkey. These accounts posed as locals based in Turkey, Canada and the US. They also recruited people to write for their website. This network had almost no following on our platforms when we removed it.”</i><br><br> Meta identified that a network of accounts originating in Russia were driving people off platform to a site which presented itself as a think-tank (T0097.204: Think Tank Persona). Meta did not make an attribution about the authenticity of this off-site think tank, so neither T0143.001: Authentic Persona or T0143.002: Fabricated Persona are used here.<br><br> Meta had access to technical data for accounts on its platform, and asserted that they were fabricated individuals posing as locals who recruited targets to write content for their website (T0097.101: Local Persona, T0097.106: Recruiter Persona, T0143.002: Fabricated Persona). |
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000268 <i>“[Meta has] removed one Page, five Facebook accounts, one Group and three Instagram accounts for foreign or government interference which is coordinated inauthentic behavior on behalf of a foreign or government entity. This small network originated in Russia and focused primarily on Turkey and Europe, and also on the United States.<br><br> “This operation relied on fake accounts — some of which had been already detected and removed by our automated systems — to manage their Page and their Group, and to drive people to their site purporting to be an independent think-tank based primarily in Turkey. These accounts posed as locals based in Turkey, Canada and the US. They also recruited people to write for their website. This network had almost no following on our platforms when we removed it.”</i><br><br> Meta identified that a network of accounts originating in Russia were driving people off platform to a site which presented itself as a think-tank (T0097.204: Think Tank Persona). Meta did not make an attribution about the authenticity of this off-site think tank, so neither T0143.001: Authentic Persona or T0143.002: Fabricated Persona are used here.<br><br> Meta had access to technical data for accounts on its platform, and asserted that they were fabricated individuals posing as locals who recruited targets to write content for their website (T0097.101: Local Persona, T0097.106: Recruiter Persona, T0143.002: Fabricated Persona). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
| Technique | Description given for this incident |
| --------- | ------------------------- |
| [T0016 Create Clickbait](../../generated_pages/techniques/T0016.md) | IT00000275 <i>“On January 4 [2017], however, the Donbas News International (DNI) agency, based in Donetsk, Ukraine, and (since September 2016) an official state media outlet of the unrecognized separatist Donetsk People’s Republic, ran an article under the sensational headline, “US sends 3,600 tanks against Russia — massive NATO deployment under way.” DNI is run by Finnish exile Janus Putkonen, described by the Finnish national broadcaster, YLE, as a “Finnish info warrior”, and the first foreigner to be granted a Donetsk passport.<br><br>“The equally sensational opening paragraph ran, “The NATO war preparation against Russia, ‘Operation Atlantic Resolve’, is in full swing. 2,000 US tanks will be sent in coming days from Germany to Eastern Europe, and 1,600 US tanks is deployed to storage facilities in the Netherlands. At the same time, NATO countries are sending thousands of soldiers in to Russian borders.”<br><br>“The report is based around an obvious factual error, conflating the total number of vehicles with the actual number of tanks, and therefore multiplying the actual tank force 20 times over. For context, military website globalfirepower.com puts the total US tank force at 8,848. If the DNI story had been true, it would have meant sending 40% of all the US’ main battle tanks to Europe in one go.<br><br>“Could this have been an innocent mistake? The simple answer is “no”. The journalist who penned the story had a sufficient command of the details to be able to write, later in the same article, “In January, 26 tanks, 100 other vehicles and 120 containers will be transported by train to Lithuania. Germany will send the 122nd Infantry Battalion.” Yet the same author apparently believed, in the headline and first paragraph, that every single vehicle in Atlantic Resolve is a tank. To call this an innocent mistake is simply not plausible.<br><br>“The DNI story can only realistically be considered a deliberate fake designed to caricaturize and demonize NATO, the United States and Germany (tactfully referred to in the report as having “rolled over Eastern Europe in its war of extermination 75 years ago”) by grossly overstating the number of MBTs involved.”</i><br><br>This behaviour matches T0016: Create Clickbait because the person who wrote the story is shown to be aware of the fact that there were non-tank vehicles later in their story, but still chose to give the article a sensationalist headline claiming that all vehicles being sent were tanks. |
| [T0023 Distort Facts](../../generated_pages/techniques/T0023.md) | IT00000272 <i>“On January 4 [2017], a little-known news site based in Donetsk, Ukraine published an article claiming that the United States was sending 3,600 tanks to Europe as part of “the NATO war preparation against Russia”.<br><br> “Like much fake news, this story started with a grain of truth: the US was about to reinforce its armored units in Europe. However, the article converted literally thousands of other vehicles — including hundreds of Humvees and trailers — into tanks, building the US force into something 20 times more powerful than it actually was.<br><br> “The story caught on online. Within three days it had been repeated by a dozen websites in the United States, Canada and Europe, and shared some 40,000 times. It was translated into Norwegian; quoted, unchallenged, by Russian state news agency RIA Novosti; and spread among Russian-language websites.<br><br> “It was also an obvious fake, as any Google news search would have revealed. Yet despite its evident falsehood, it spread widely, and not just in directly Kremlin-run media. Tracking the spread of this fake therefore shines a light on the wider question of how fake stories are dispersed.”</i><br><br> Russian state news agency RIA Novosti presents itself as a news outlet (T0097.202: News Outlet Persona). RIA Novosti is a real news outlet (T0143.001: Authentic Persona), but it did not carry out the basic verification of the narrative it published that is implicitly expected of institutions presenting themselves as news outlets.<br><br> We can’t know how or why this narrative ended up being published by RIA Novosti, but we know that it presented a distorted reality as authentic information (T0023: Distort Facts), claiming that the US was sending 3,600 tanks, instead of 3,600 vehicles which included ~180 tanks. |
| [T0084.002 Plagiarise Content](../../generated_pages/techniques/T0084.002.md) | IT00000278 <i>“The sixth [website to repost a confirmed false narrative investigated in this report] is an apparent think tank, the Center for Global Strategic Monitoring. This website describes itself, in English apparently written by a non-native speaker, as a “nonprofit and nonpartisan research and analysis institution dedicated to providing insights of the think tank community publications”. It does, indeed, publish think-tank reports on issues such as Turkey and US-China relations; however, the reports are the work of other think tanks, often unattributed (the two mentioned in this sentence were actually produced by the Brookings Institution, although the website makes no mention of the fact). It also fails to provide an address, or any other contact details other than an email, and its (long) list of experts includes entries apparently copied and pasted from other institutions. Thus, the “think tank” website which shared the fake story appears to be a fake itself.”</i> In this example a website which amplified a false narrative presented itself as a think tank (T0097.204: Think Tank Persona).<br><br> This is an entirely fabricated persona (T0143.002: Fabricated Persona); it republished content from other think tanks without attribution (T0084.002: Plagiarise Content) and fabricated experts (T0097.108: Expert Persona, T0143.002: Fabricated Persona) to make it more believable that they were a real think tank. |
| [T0097.108 Expert Persona](../../generated_pages/techniques/T0097.108.md) | IT00000277 <i>“The sixth [website to repost a confirmed false narrative investigated in this report] is an apparent think tank, the Center for Global Strategic Monitoring. This website describes itself, in English apparently written by a non-native speaker, as a “nonprofit and nonpartisan research and analysis institution dedicated to providing insights of the think tank community publications”. It does, indeed, publish think-tank reports on issues such as Turkey and US-China relations; however, the reports are the work of other think tanks, often unattributed (the two mentioned in this sentence were actually produced by the Brookings Institution, although the website makes no mention of the fact). It also fails to provide an address, or any other contact details other than an email, and its (long) list of experts includes entries apparently copied and pasted from other institutions. Thus, the “think tank” website which shared the fake story appears to be a fake itself.”</i> In this example a website which amplified a false narrative presented itself as a think tank (T0097.204: Think Tank Persona).<br><br> This is an entirely fabricated persona (T0143.002: Fabricated Persona); it republished content from other think tanks without attribution (T0084.002: Plagiarise Content) and fabricated experts (T0097.108: Expert Persona, T0143.002: Fabricated Persona) to make it more believable that they were a real think tank. |
| [T0097.103 Activist Persona](../../generated_pages/techniques/T0097.103.md) | IT00000282 <i>“One example of a fake reporter account targeting Americans is “Jenny Powell,” a self-described Washington-based journalist, volunteer, and environmental activist. At first glance, Powell’s Twitter timeline looks like it belongs to a young and eager reporter amplifying her interests. But her profile photo is a stock image, and many of her links go to the propaganda sites.<br><br> “Powell, who joined the platform just last month, shares links to stories from major US news media outlets, retweets local news about Washington, DC, and regularly promotes content from The Foreign Code and The Economy Club. Other fake journalist accounts behaved similarly to Powell and had generic descriptions. One of the accounts, for a fake Bruce Lopez in Louisiana, has a bio that describes him as a “Correspondent Traveler noun|linking verb|noun/verb/adjective|,” which appears to reveal the formula used to write Twitter bios for the accounts.”</i><br><br> The Jenny Powell account used in this influence operation presents as both a journalist and an activist (T0097.102: Journalist Persona, T0097.103: Activist Persona, T0143.002: Fabricated Persona). This example shows how threat actors can easily follow a template to present a fabricated persona to their target audience (T0144.002: Persona Template). |
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000283 <i>“One example of a fake reporter account targeting Americans is “Jenny Powell,” a self-described Washington-based journalist, volunteer, and environmental activist. At first glance, Powell’s Twitter timeline looks like it belongs to a young and eager reporter amplifying her interests. But her profile photo is a stock image, and many of her links go to the propaganda sites.<br><br> “Powell, who joined the platform just last month, shares links to stories from major US news media outlets, retweets local news about Washington, DC, and regularly promotes content from The Foreign Code and The Economy Club. Other fake journalist accounts behaved similarly to Powell and had generic descriptions. One of the accounts, for a fake Bruce Lopez in Louisiana, has a bio that describes him as a “Correspondent Traveler noun|linking verb|noun/verb/adjective|,” which appears to reveal the formula used to write Twitter bios for the accounts.”</i><br><br> The Jenny Powell account used in this influence operation presents as both a journalist and an activist (T0097.102: Journalist Persona, T0097.103: Activist Persona, T0143.002: Fabricated Persona). This example shows how threat actors can easily follow a template to present a fabricated persona to their target audience (T0144.002: Persona Template). |
| [T0144.002 Persona Template](../../generated_pages/techniques/T0144.002.md) | IT00000284 <i>“One example of a fake reporter account targeting Americans is “Jenny Powell,” a self-described Washington-based journalist, volunteer, and environmental activist. At first glance, Powell’s Twitter timeline looks like it belongs to a young and eager reporter amplifying her interests. But her profile photo is a stock image, and many of her links go to the propaganda sites.<br><br> “Powell, who joined the platform just last month, shares links to stories from major US news media outlets, retweets local news about Washington, DC, and regularly promotes content from The Foreign Code and The Economy Club. Other fake journalist accounts behaved similarly to Powell and had generic descriptions. One of the accounts, for a fake Bruce Lopez in Louisiana, has a bio that describes him as a “Correspondent Traveler noun|linking verb|noun/verb/adjective|,” which appears to reveal the formula used to write Twitter bios for the accounts.”</i><br><br> The Jenny Powell account used in this influence operation presents as both a journalist and an activist (T0097.102: Journalist Persona, T0097.103: Activist Persona, T0143.002: Fabricated Persona). This example shows how threat actors can easily follow a template to present a fabricated persona to their target audience (T0144.002: Persona Template). |
| [T0145.006 Attractive Person Account Imagery](../../generated_pages/techniques/T0145.006.md) | IT00000280 <i>“One example of a fake reporter account targeting Americans is “Jenny Powell,” a self-described Washington-based journalist, volunteer, and environmental activist. At first glance, Powell’s Twitter timeline looks like it belongs to a young and eager reporter amplifying her interests. But her profile photo is a stock image, and many of her links go to the propaganda sites.<br><br>“Powell, who joined the platform just last month, shares links to stories from major US news media outlets, retweets local news about Washington, DC, and regularly promotes content from The Foreign Code and The Economy Club. Other fake journalist accounts behaved similarly to Powell and had generic descriptions. One of the accounts, for a fake Bruce Lopez in Louisiana, has a bio that describes him as a “Correspondent Traveler noun|linking verb|noun/verb/adjective|,” which appears to reveal the formula used to write Twitter bios for the accounts.”</i><br><br>This behaviour matches T0145.006: Attractive Person Account Imagery because the account was identified as using a stock image of an attractive person as its profile picture. |
| [T0145.007 Stock Image Account Imagery](../../generated_pages/techniques/T0145.007.md) | IT00000280 <i>“One example of a fake reporter account targeting Americans is “Jenny Powell,” a self-described Washington-based journalist, volunteer, and environmental activist. At first glance, Powell’s Twitter timeline looks like it belongs to a young and eager reporter amplifying her interests. But her profile photo is a stock image, and many of her links go to the propaganda sites.<br><br>“Powell, who joined the platform just last month, shares links to stories from major US news media outlets, retweets local news about Washington, DC, and regularly promotes content from The Foreign Code and The Economy Club. Other fake journalist accounts behaved similarly to Powell and had generic descriptions. One of the accounts, for a fake Bruce Lopez in Louisiana, has a bio that describes him as a “Correspondent Traveler noun|linking verb|noun/verb/adjective|,” which appears to reveal the formula used to write Twitter bios for the accounts.”</I><br><br>This behaviour matches T0145.007: Stock Image Account Imagery because the account was identified as using a stock image as its profile picture. |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
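Note: the “Correspondent Traveler noun|linking verb|noun/verb/adjective|” bio quoted in the incident above suggests the operators filled a fixed slot-based template to mass-produce account bios (T0144.002: Persona Template). A minimal illustrative sketch of how such a template could be filled; all slot word lists here are hypothetical examples, not taken from the report:

```python
import random

# Hypothetical slot template mirroring the "noun | verb phrase | adjective"
# formula exposed in the leaked bio. The word lists are invented for
# illustration only.
TEMPLATE = "{role} {noun} | {verb} {interest} | {adjective}"

ROLES = ["Correspondent", "Reporter", "Journalist"]
NOUNS = ["Traveler", "Storyteller", "Observer"]
VERBS = ["covering", "following", "sharing"]
INTERESTS = ["local news", "world events", "politics"]
ADJECTIVES = ["curious", "independent", "passionate"]

def make_bio(rng: random.Random) -> str:
    """Fill each template slot with a randomly chosen word to yield a bio."""
    return TEMPLATE.format(
        role=rng.choice(ROLES),
        noun=rng.choice(NOUNS),
        verb=rng.choice(VERBS),
        interest=rng.choice(INTERESTS),
        adjective=rng.choice(ADJECTIVES),
    )

rng = random.Random(0)  # fixed seed so the sketch is reproducible
bios = [make_bio(rng) for _ in range(3)]
for bio in bios:
    print(bio)
```

Because every bio comes from the same slot structure, templated accounts share a telltale grammatical shape even when the individual words differ, which is one way analysts spot persona templates at scale.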
| Technique | Description given for this incident |
| --------- | ------------------------- |
| [T0097.102 Journalist Persona](../../generated_pages/techniques/T0097.102.md) | IT00000287 <I>“[Meta] removed 41 Facebook accounts, five Groups, and four Instagram accounts for violating our policy against coordinated inauthentic behavior. This activity originated in Belarus and primarily targeted audiences in the Middle East and Europe.<br><br> “The core of this activity began in October 2021, with some accounts created as recently as mid-November. The people behind it used newly-created fake accounts — many of which were detected and disabled by our automated systems soon after creation — to pose as journalists and activists from the European Union, particularly Poland and Lithuania. Some of the accounts used profile photos likely generated using artificial intelligence techniques like generative adversarial networks (GAN). These fictitious personas posted criticism of Poland in English, Polish, and Kurdish, including pictures and videos about Polish border guards allegedly violating migrants’ rights, and compared Poland’s treatment of migrants against other countries’. They also posted to Groups focused on the welfare of migrants in Europe. A few accounts posted in Russian about relations between Belarus and the Baltic States.”</i><br><br> This example shows how accounts identified as participating in coordinated inauthentic behaviour were presenting themselves as journalists and activists while spreading operation narratives (T0097.102: Journalist Persona, T0097.103: Activist Persona).<br><br> Additionally, analysts at Meta identified accounts which were participating in coordinated inauthentic behaviour that had likely used AI-Generated images as their profile pictures (T0145.002: AI-Generated Account Imagery). |
| [T0097.103 Activist Persona](../../generated_pages/techniques/T0097.103.md) | IT00000288 <I>“[Meta] removed 41 Facebook accounts, five Groups, and four Instagram accounts for violating our policy against coordinated inauthentic behavior. This activity originated in Belarus and primarily targeted audiences in the Middle East and Europe.<br><br> “The core of this activity began in October 2021, with some accounts created as recently as mid-November. The people behind it used newly-created fake accounts — many of which were detected and disabled by our automated systems soon after creation — to pose as journalists and activists from the European Union, particularly Poland and Lithuania. Some of the accounts used profile photos likely generated using artificial intelligence techniques like generative adversarial networks (GAN). These fictitious personas posted criticism of Poland in English, Polish, and Kurdish, including pictures and videos about Polish border guards allegedly violating migrants’ rights, and compared Poland’s treatment of migrants against other countries’. They also posted to Groups focused on the welfare of migrants in Europe. A few accounts posted in Russian about relations between Belarus and the Baltic States.”</i><br><br> This example shows how accounts identified as participating in coordinated inauthentic behaviour were presenting themselves as journalists and activists while spreading operation narratives (T0097.102: Journalist Persona, T0097.103: Activist Persona).<br><br> Additionally, analysts at Meta identified accounts which were participating in coordinated inauthentic behaviour that had likely used AI-Generated images as their profile pictures (T0145.002: AI-Generated Account Imagery). |
| [T0097.103 Activist Persona](../../generated_pages/techniques/T0097.103.md) | IT00000290 <i>“[Meta] removed a network of accounts in Vietnam for violating our Inauthentic Behavior policy against mass reporting. They coordinated the targeting of activists and other people who publicly criticized the Vietnamese government and used false reports of various violations in an attempt to have these users removed from our platform. The people behind this activity relied primarily on authentic and duplicate accounts to submit hundreds — in some cases, thousands — of complaints against their targets through our abuse reporting flows.<br><br>“Many operators also maintained fake accounts — some of which were detected and disabled by our automated systems — to pose as their targets so they could then report the legitimate accounts as fake. They would frequently change the gender and name of their fake accounts to resemble the target individual. Among the most common claims in this misleading reporting activity were complaints of impersonation, and to a much lesser extent inauthenticity. The network also advertised abusive services in their bios and constantly evolved their tactics in an attempt to evade detection.“</i><br><br>In this example actors repurposed their accounts to impersonate targeted activists (T0097.103: Activist Persona, T0143.003: Impersonated Persona) in order to falsely report the activists’ legitimate accounts as impersonations (T0124.001: Report Non-Violative Opposing Content). |
| [T0124.001 Report Non-Violative Opposing Content](../../generated_pages/techniques/T0124.001.md) | IT00000291 <i>“[Meta] removed a network of accounts in Vietnam for violating our Inauthentic Behavior policy against mass reporting. They coordinated the targeting of activists and other people who publicly criticized the Vietnamese government and used false reports of various violations in an attempt to have these users removed from our platform. The people behind this activity relied primarily on authentic and duplicate accounts to submit hundreds — in some cases, thousands — of complaints against their targets through our abuse reporting flows.<br><br>“Many operators also maintained fake accounts — some of which were detected and disabled by our automated systems — to pose as their targets so they could then report the legitimate accounts as fake. They would frequently change the gender and name of their fake accounts to resemble the target individual. Among the most common claims in this misleading reporting activity were complaints of impersonation, and to a much lesser extent inauthenticity. The network also advertised abusive services in their bios and constantly evolved their tactics in an attempt to evade detection.“</i><br><br>In this example actors repurposed their accounts to impersonate targeted activists (T0097.103: Activist Persona, T0143.003: Impersonated Persona) in order to falsely report the activists’ legitimate accounts as impersonations (T0124.001: Report Non-Violative Opposing Content). |
| [T0143.003 Impersonated Persona](../../generated_pages/techniques/T0143.003.md) | IT00000292 <i>“[Meta] removed a network of accounts in Vietnam for violating our Inauthentic Behavior policy against mass reporting. They coordinated the targeting of activists and other people who publicly criticized the Vietnamese government and used false reports of various violations in an attempt to have these users removed from our platform. The people behind this activity relied primarily on authentic and duplicate accounts to submit hundreds — in some cases, thousands — of complaints against their targets through our abuse reporting flows.<br><br>“Many operators also maintained fake accounts — some of which were detected and disabled by our automated systems — to pose as their targets so they could then report the legitimate accounts as fake. They would frequently change the gender and name of their fake accounts to resemble the target individual. Among the most common claims in this misleading reporting activity were complaints of impersonation, and to a much lesser extent inauthenticity. The network also advertised abusive services in their bios and constantly evolved their tactics in an attempt to evade detection.”</i><br><br>In this example, actors repurposed their accounts to impersonate targeted activists (T0097.103: Activist Persona, T0143.003: Impersonated Persona) in order to falsely report the activists’ legitimate accounts as impersonations (T0124.001: Report Non-Violative Opposing Content). |
| [T0145.002 AI-Generated Account Imagery](../../generated_pages/techniques/T0145.002.md) | IT00000289 <i>“[Meta] removed 41 Facebook accounts, five Groups, and four Instagram accounts for violating our policy against coordinated inauthentic behavior. This activity originated in Belarus and primarily targeted audiences in the Middle East and Europe.<br><br> “The core of this activity began in October 2021, with some accounts created as recently as mid-November. The people behind it used newly-created fake accounts — many of which were detected and disabled by our automated systems soon after creation — to pose as journalists and activists from the European Union, particularly Poland and Lithuania. Some of the accounts used profile photos likely generated using artificial intelligence techniques like generative adversarial networks (GAN). These fictitious personas posted criticism of Poland in English, Polish, and Kurdish, including pictures and videos about Polish border guards allegedly violating migrants’ rights, and compared Poland’s treatment of migrants against other countries’. They also posted to Groups focused on the welfare of migrants in Europe. A few accounts posted in Russian about relations between Belarus and the Baltic States.”</i><br><br> This example shows how accounts identified as participating in coordinated inauthentic behaviour were presenting themselves as journalists and activists while spreading operation narratives (T0097.102: Journalist Persona, T0097.103: Activist Persona).<br><br> Additionally, analysts at Meta identified accounts which were participating in coordinated inauthentic behaviour that had likely used AI-Generated images as their profile pictures (T0145.002: AI-Generated Account Imagery). |
| Technique | Description given for this incident |
| --------- | ------------------------- |
| [T0043.001 Use Encrypted Chat Apps](../../generated_pages/techniques/T0043.001.md) | IT00000294 <i>“[Russia’s social media] reach isn't the same as Russian state media, but they are trying to recreate what RT and Sputnik had done," said one EU official involved in tracking Russian disinformation. "It's a coordinated effort that goes beyond social media and involves specific websites."<br><br> “Central to that wider online playbook is a Telegram channel called Warfakes and an affiliated website. Since the beginning of the conflict, that social media channel has garnered more than 725,000 members and repeatedly shares alleged fact-checks aimed at debunking Ukrainian narratives, using language similar to Western-style fact-checking outlets.”</i><br><br> In this example a Telegram channel (T0043.001: Use Encrypted Chat Apps) was established which presented itself as a source of fact checks (T0097.203: Fact Checking Organisation Persona). |
| [T0084.002 Plagiarise Content](../../generated_pages/techniques/T0084.002.md) | <i>“After the European Union banned Kremlin-backed media outlets and social media giants demoted their posts for peddling falsehoods about the war in Ukraine, Moscow has turned to its cadre of diplomats, government spokespeople and ministers — many of whom have extensive followings on social media — to promote disinformation about the conflict in Eastern Europe, according to four EU and United States officials.”</i><br><br>In this example authentic Russian government officials used their own accounts to promote false narratives (T0143.001: Authentic Persona, T0097.111: Government Official Persona).<br><br>The use of accounts managed by authentic Government / Diplomats to spread false narratives makes it harder for platforms to enforce content moderation, because of the political ramifications they may face for censoring elected officials (T0131: Exploit TOS/Content Moderation). For example, Twitter previously argued that official channels of world leaders are not removed due to the high public interest associated with their activities. |
| [T0085.004 Develop Document](../../generated_pages/techniques/T0085.004.md) | <i>“After the European Union banned Kremlin-backed media outlets and social media giants demoted their posts for peddling falsehoods about the war in Ukraine, Moscow has turned to its cadre of diplomats, government spokespeople and ministers — many of whom have extensive followings on social media — to promote disinformation about the conflict in Eastern Europe, according to four EU and United States officials.”</i><br><br>In this example authentic Russian government officials used their own accounts to promote false narratives (T0143.001: Authentic Persona, T0097.111: Government Official Persona).<br><br>The use of accounts managed by authentic Government / Diplomats to spread false narratives makes it harder for platforms to enforce content moderation, because of the political ramifications they may face for censoring elected officials (T0131: Exploit TOS/Content Moderation). For example, Twitter previously argued that official channels of world leaders are not removed due to the high public interest associated with their activities. |
| [T0097.110 Party Official Persona](../../generated_pages/techniques/T0097.110.md) | <i>“After the European Union banned Kremlin-backed media outlets and social media giants demoted their posts for peddling falsehoods about the war in Ukraine, Moscow has turned to its cadre of diplomats, government spokespeople and ministers — many of whom have extensive followings on social media — to promote disinformation about the conflict in Eastern Europe, according to four EU and United States officials.”</i><br><br>In this example authentic Russian government officials used their own accounts to promote false narratives (T0143.001: Authentic Persona, T0097.111: Government Official Persona).<br><br>The use of accounts managed by authentic Government / Diplomats to spread false narratives makes it harder for platforms to enforce content moderation, because of the political ramifications they may face for censoring elected officials (T0131: Exploit TOS/Content Moderation). For example, Twitter previously argued that official channels of world leaders are not removed due to the high public interest associated with their activities. |
| [T0097.111 Government Official Persona](../../generated_pages/techniques/T0097.111.md) | IT00000337 <i>“After the European Union banned Kremlin-backed media outlets and social media giants demoted their posts for peddling falsehoods about the war in Ukraine, Moscow has turned to its cadre of diplomats, government spokespeople and ministers — many of whom have extensive followings on social media — to promote disinformation about the conflict in Eastern Europe, according to four EU and United States officials.”</i><br><br>In this example authentic Russian government officials used their own accounts to promote false narratives (T0143.001: Authentic Persona, T0097.111: Government Official Persona).<br><br>The use of accounts managed by authentic Government / Diplomats to spread false narratives makes it harder for platforms to enforce content moderation, because of the political ramifications they may face for censoring elected officials (T0131: Exploit TOS/Content Moderation). For example, Twitter previously argued that official channels of world leaders are not removed due to the high public interest associated with their activities. |
|
||||
| [T0097.203 Fact Checking Organisation Persona](../../generated_pages/techniques/T0097.203.md) | IT00000295 <i>“[Russia’s social media] reach isn't the same as Russian state media, but they are trying to recreate what RT and Sputnik had done," said one EU official involved in tracking Russian disinformation. "It's a coordinated effort that goes beyond social media and involves specific websites."<br><br> “Central to that wider online playbook is a Telegram channel called Warfakes and an affiliated website. Since the beginning of the conflict, that social media channel has garnered more than 725,000 members and repeatedly shares alleged fact-checks aimed at debunking Ukrainian narratives, using language similar to Western-style fact-checking outlets.”</i><br><br> In this example a Telegram channel (T0043.001: Use Encrypted Chat Apps) was established which presented itself as a source of fact checks (T0097.203: Fact Checking Organisation Persona). |
| [T0097.206 Government Institution Persona](../../generated_pages/techniques/T0097.206.md) | <i>“After the European Union banned Kremlin-backed media outlets and social media giants demoted their posts for peddling falsehoods about the war in Ukraine, Moscow has turned to its cadre of diplomats, government spokespeople and ministers — many of whom have extensive followings on social media — to promote disinformation about the conflict in Eastern Europe, according to four EU and United States officials.”</i><br><br>In this example authentic Russian government officials used their own accounts to promote false narratives (T0143.001: Authentic Persona, T0097.111: Government Official Persona).<br><br>The use of accounts managed by authentic Government / Diplomats to spread false narratives makes it harder for platforms to enforce content moderation, because of the political ramifications they may face for censoring elected officials (T0131: Exploit TOS/Content Moderation). For example, Twitter previously argued that official channels of world leaders are not removed due to the high public interest associated with their activities. |
| [T0129.006 Deny Involvement](../../generated_pages/techniques/T0129.006.md) | <i>“After the European Union banned Kremlin-backed media outlets and social media giants demoted their posts for peddling falsehoods about the war in Ukraine, Moscow has turned to its cadre of diplomats, government spokespeople and ministers — many of whom have extensive followings on social media — to promote disinformation about the conflict in Eastern Europe, according to four EU and United States officials.”</i><br><br>In this example authentic Russian government officials used their own accounts to promote false narratives (T0143.001: Authentic Persona, T0097.111: Government Official Persona).<br><br>The use of accounts managed by authentic Government / Diplomats to spread false narratives makes it harder for platforms to enforce content moderation, because of the political ramifications they may face for censoring elected officials (T0131: Exploit TOS/Content Moderation). For example, Twitter previously argued that official channels of world leaders are not removed due to the high public interest associated with their activities. |
| [T0131 Exploit TOS/Content Moderation](../../generated_pages/techniques/T0131.md) | <i>“After the European Union banned Kremlin-backed media outlets and social media giants demoted their posts for peddling falsehoods about the war in Ukraine, Moscow has turned to its cadre of diplomats, government spokespeople and ministers — many of whom have extensive followings on social media — to promote disinformation about the conflict in Eastern Europe, according to four EU and United States officials.”</i><br><br>In this example authentic Russian government officials used their own accounts to promote false narratives (T0143.001: Authentic Persona, T0097.111: Government Official Persona).<br><br>The use of accounts managed by authentic Government / Diplomats to spread false narratives makes it harder for platforms to enforce content moderation, because of the political ramifications they may face for censoring elected officials (T0131: Exploit TOS/Content Moderation). For example, Twitter previously argued that official channels of world leaders are not removed due to the high public interest associated with their activities. |
| [T0137 Make Money](../../generated_pages/techniques/T0137.md) | <i>“After the European Union banned Kremlin-backed media outlets and social media giants demoted their posts for peddling falsehoods about the war in Ukraine, Moscow has turned to its cadre of diplomats, government spokespeople and ministers — many of whom have extensive followings on social media — to promote disinformation about the conflict in Eastern Europe, according to four EU and United States officials.”</i><br><br>In this example authentic Russian government officials used their own accounts to promote false narratives (T0143.001: Authentic Persona, T0097.111: Government Official Persona).<br><br>The use of accounts managed by authentic Government / Diplomats to spread false narratives makes it harder for platforms to enforce content moderation, because of the political ramifications they may face for censoring elected officials (T0131: Exploit TOS/Content Moderation). For example, Twitter previously argued that official channels of world leaders are not removed due to the high public interest associated with their activities. |
| [T0139.002 Silence](../../generated_pages/techniques/T0139.002.md) | <i>“After the European Union banned Kremlin-backed media outlets and social media giants demoted their posts for peddling falsehoods about the war in Ukraine, Moscow has turned to its cadre of diplomats, government spokespeople and ministers — many of whom have extensive followings on social media — to promote disinformation about the conflict in Eastern Europe, according to four EU and United States officials.”</i><br><br>In this example authentic Russian government officials used their own accounts to promote false narratives (T0143.001: Authentic Persona, T0097.111: Government Official Persona).<br><br>The use of accounts managed by authentic Government / Diplomats to spread false narratives makes it harder for platforms to enforce content moderation, because of the political ramifications they may face for censoring elected officials (T0131: Exploit TOS/Content Moderation). For example, Twitter previously argued that official channels of world leaders are not removed due to the high public interest associated with their activities. |
| [T0143.001 Authentic Persona](../../generated_pages/techniques/T0143.001.md) | <i>“After the European Union banned Kremlin-backed media outlets and social media giants demoted their posts for peddling falsehoods about the war in Ukraine, Moscow has turned to its cadre of diplomats, government spokespeople and ministers — many of whom have extensive followings on social media — to promote disinformation about the conflict in Eastern Europe, according to four EU and United States officials.”</i><br><br>In this example authentic Russian government officials used their own accounts to promote false narratives (T0143.001: Authentic Persona, T0097.111: Government Official Persona).<br><br>The use of accounts managed by authentic Government / Diplomats to spread false narratives makes it harder for platforms to enforce content moderation, because of the political ramifications they may face for censoring elected officials (T0131: Exploit TOS/Content Moderation). For example, Twitter previously argued that official channels of world leaders are not removed due to the high public interest associated with their activities. |
| [T0143.003 Impersonated Persona](../../generated_pages/techniques/T0143.003.md) | <i>“After the European Union banned Kremlin-backed media outlets and social media giants demoted their posts for peddling falsehoods about the war in Ukraine, Moscow has turned to its cadre of diplomats, government spokespeople and ministers — many of whom have extensive followings on social media — to promote disinformation about the conflict in Eastern Europe, according to four EU and United States officials.”</i><br><br>In this example authentic Russian government officials used their own accounts to promote false narratives (T0143.001: Authentic Persona, T0097.111: Government Official Persona).<br><br>The use of accounts managed by authentic Government / Diplomats to spread false narratives makes it harder for platforms to enforce content moderation, because of the political ramifications they may face for censoring elected officials (T0131: Exploit TOS/Content Moderation). For example, Twitter previously argued that official channels of world leaders are not removed due to the high public interest associated with their activities. |
| [T0145.001 Copy Account Imagery](../../generated_pages/techniques/T0145.001.md) | <i>“After the European Union banned Kremlin-backed media outlets and social media giants demoted their posts for peddling falsehoods about the war in Ukraine, Moscow has turned to its cadre of diplomats, government spokespeople and ministers — many of whom have extensive followings on social media — to promote disinformation about the conflict in Eastern Europe, according to four EU and United States officials.”</i><br><br>In this example authentic Russian government officials used their own accounts to promote false narratives (T0143.001: Authentic Persona, T0097.111: Government Official Persona).<br><br>The use of accounts managed by authentic Government / Diplomats to spread false narratives makes it harder for platforms to enforce content moderation, because of the political ramifications they may face for censoring elected officials (T0131: Exploit TOS/Content Moderation). For example, Twitter previously argued that official channels of world leaders are not removed due to the high public interest associated with their activities. |
| [T0131 Exploit TOS/Content Moderation](../../generated_pages/techniques/T0131.md) | IT00000338 <i>“After the European Union banned Kremlin-backed media outlets and social media giants demoted their posts for peddling falsehoods about the war in Ukraine, Moscow has turned to its cadre of diplomats, government spokespeople and ministers — many of whom have extensive followings on social media — to promote disinformation about the conflict in Eastern Europe, according to four EU and United States officials.”</i><br><br>In this example authentic Russian government officials used their own accounts to promote false narratives (T0143.001: Authentic Persona, T0097.111: Government Official Persona).<br><br>The use of accounts managed by authentic Government / Diplomats to spread false narratives makes it harder for platforms to enforce content moderation, because of the political ramifications they may face for censoring elected officials (T0131: Exploit TOS/Content Moderation). For example, Twitter previously argued that official channels of world leaders are not removed due to the high public interest associated with their activities. |
| [T0143.001 Authentic Persona](../../generated_pages/techniques/T0143.001.md) | IT00000336 <i>“After the European Union banned Kremlin-backed media outlets and social media giants demoted their posts for peddling falsehoods about the war in Ukraine, Moscow has turned to its cadre of diplomats, government spokespeople and ministers — many of whom have extensive followings on social media — to promote disinformation about the conflict in Eastern Europe, according to four EU and United States officials.”</i><br><br>In this example authentic Russian government officials used their own accounts to promote false narratives (T0143.001: Authentic Persona, T0097.111: Government Official Persona).<br><br>The use of accounts managed by authentic Government / Diplomats to spread false narratives makes it harder for platforms to enforce content moderation, because of the political ramifications they may face for censoring elected officials (T0131: Exploit TOS/Content Moderation). For example, Twitter previously argued that official channels of world leaders are not removed due to the high public interest associated with their activities. |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
| Reference | Pub Date | Authors | Org | Archive |
| --------- | -------- | ------- | --- | ------- |
| [https://www.svt.se/nyheter/inrikes/china-s-large-scale-media-push-attempts-to-influence-swedish-media](https://www.svt.se/nyheter/inrikes/china-s-large-scale-media-push-attempts-to-influence-swedish-media) | 2020/01/20 | Knut Kainz Rognerud, Karin Moberg, Jon Åhlén | SÅ ARBETAR VI | [https://web.archive.org/web/20240408034525/https://www.svt.se/nyheter/inrikes/china-s-large-scale-media-push-attempts-to-influence-swedish-media](https://web.archive.org/web/20240408034525/https://www.svt.se/nyheter/inrikes/china-s-large-scale-media-push-attempts-to-influence-swedish-media) |
| Technique | Description given for this incident |
| --------- | ------------------------- |
| [T0084.002 Plagiarise Content](../../generated_pages/techniques/T0084.002.md) | <i>“Four media companies – Svenska Dagbladet, Expressen, Sveriges Radio, and Sveriges Television – stated that they had been contacted by the Chinese embassy on several occasions, and that they, for instance, had been criticized on their publications, both by letters and e-mails.<br><br> The media company Svenska Dagbladet, had been contacted on several occasions in the past two years, including via e-mails directly from the Chinese ambassador to Sweden. Several times, China and the Chinese ambassador had criticized the media company’s publications regarding the conditions in China. Individual reporters also reported having been subjected to criticism.<br><br> The tabloid Expressen had received several letters and e-mails from the embassy, e-mails containing criticism and threatening formulations regarding the coverage of the Swedish book publisher Gui Minhai, who has been imprisoned in China since 2015. Formulations such as “media tyranny” could be found in the e-mails.”</i><br><br> In this case, the Chinese ambassador is using their official role (T0143.001: Authentic Persona, T0097.111: Government Official Persona) to try to influence the Swedish press. A government official trying to interfere in other countries' media activities could be a violation of press freedom. In this specific case, the Chinese diplomats are trying to silence criticism against China (T0139.002: Silence). |
| [T0085.004 Develop Document](../../generated_pages/techniques/T0085.004.md) | <i>“Four media companies – Svenska Dagbladet, Expressen, Sveriges Radio, and Sveriges Television – stated that they had been contacted by the Chinese embassy on several occasions, and that they, for instance, had been criticized on their publications, both by letters and e-mails.<br><br> The media company Svenska Dagbladet, had been contacted on several occasions in the past two years, including via e-mails directly from the Chinese ambassador to Sweden. Several times, China and the Chinese ambassador had criticized the media company’s publications regarding the conditions in China. Individual reporters also reported having been subjected to criticism.<br><br> The tabloid Expressen had received several letters and e-mails from the embassy, e-mails containing criticism and threatening formulations regarding the coverage of the Swedish book publisher Gui Minhai, who has been imprisoned in China since 2015. Formulations such as “media tyranny” could be found in the e-mails.”</i><br><br> In this case, the Chinese ambassador is using their official role (T0143.001: Authentic Persona, T0097.111: Government Official Persona) to try to influence the Swedish press. A government official trying to interfere in other countries' media activities could be a violation of press freedom. In this specific case, the Chinese diplomats are trying to silence criticism against China (T0139.002: Silence). |
| [T0097.110 Party Official Persona](../../generated_pages/techniques/T0097.110.md) | <i>“Four media companies – Svenska Dagbladet, Expressen, Sveriges Radio, and Sveriges Television – stated that they had been contacted by the Chinese embassy on several occasions, and that they, for instance, had been criticized on their publications, both by letters and e-mails.<br><br> The media company Svenska Dagbladet, had been contacted on several occasions in the past two years, including via e-mails directly from the Chinese ambassador to Sweden. Several times, China and the Chinese ambassador had criticized the media company’s publications regarding the conditions in China. Individual reporters also reported having been subjected to criticism.<br><br> The tabloid Expressen had received several letters and e-mails from the embassy, e-mails containing criticism and threatening formulations regarding the coverage of the Swedish book publisher Gui Minhai, who has been imprisoned in China since 2015. Formulations such as “media tyranny” could be found in the e-mails.”</i><br><br> In this case, the Chinese ambassador is using their official role (T0143.001: Authentic Persona, T0097.111: Government Official Persona) to try to influence the Swedish press. A government official trying to interfere in other countries' media activities could be a violation of press freedom. In this specific case, the Chinese diplomats are trying to silence criticism against China (T0139.002: Silence). |
| [T0097.111 Government Official Persona](../../generated_pages/techniques/T0097.111.md) | "<i>“Four media companies – Svenska Dagbladet, Expressen, Sveriges Radio, and Sveriges Television – stated that they had been contacted by the Chinese embassy on several occasions, and that they, for instance, had been criticized on their publications, both by letters and e-mails.<br><br> The media company Svenska Dagbladet, had been contacted on several occasions in the past two years, including via e-mails directly from the Chinese ambassador to Sweden. Several times, China and the Chinese ambassador had criticized the media company’s publications regarding the conditions in China. Individual reporters also reported having been subjected to criticism.<br><br> The tabloid Expressen had received several letters and e-mails from the embassy, e-mails containing criticism and threatening formulations regarding the coverage of the Swedish book publisher Gui Minhai, who has been imprisoned in China since 2015. Formulations such as “media tyranny” could be found in the e-mails.”</i><br><br> In this case, the Chinese ambassador is using their official role (T0143.001: Authentic Persona, T0097.111: Government Official Persona) to try to influence Swedish press. A government official trying to interfere in other countries' media activities could be a violation of press freedom. In this specific case, the Chinese diplomats are trying to silence criticism against China (T0139.002: Silence).”" |
| [T0097.206 Government Institution Persona](../../generated_pages/techniques/T0097.206.md) | <i>“Four media companies – Svenska Dagbladet, Expressen, Sveriges Radio, and Sveriges Television – stated that they had been contacted by the Chinese embassy on several occasions, and that they, for instance, had been criticized on their publications, both by letters and e-mails.<br><br> The media company Svenska Dagbladet, had been contacted on several occasions in the past two years, including via e-mails directly from the Chinese ambassador to Sweden. Several times, China and the Chinese ambassador had criticized the media company’s publications regarding the conditions in China. Individual reporters also reported having been subjected to criticism.<br><br> The tabloid Expressen had received several letters and e-mails from the embassy, e-mails containing criticism and threatening formulations regarding the coverage of the Swedish book publisher Gui Minhai, who has been imprisoned in China since 2015. Formulations such as “media tyranny” could be found in the e-mails.”</i><br><br> In this case, the Chinese ambassador is using their official role (T0143.001: Authentic Persona, T0097.111: Government Official Persona) to try to influence the Swedish press. A government official trying to interfere in another country’s media activities could be a violation of press freedom. In this specific case, the Chinese diplomats are trying to silence criticism of China (T0139.002: Silence). |
| [T0129.006 Deny Involvement](../../generated_pages/techniques/T0129.006.md) | <i>“Four media companies – Svenska Dagbladet, Expressen, Sveriges Radio, and Sveriges Television – stated that they had been contacted by the Chinese embassy on several occasions, and that they, for instance, had been criticized on their publications, both by letters and e-mails.<br><br> The media company Svenska Dagbladet, had been contacted on several occasions in the past two years, including via e-mails directly from the Chinese ambassador to Sweden. Several times, China and the Chinese ambassador had criticized the media company’s publications regarding the conditions in China. Individual reporters also reported having been subjected to criticism.<br><br> The tabloid Expressen had received several letters and e-mails from the embassy, e-mails containing criticism and threatening formulations regarding the coverage of the Swedish book publisher Gui Minhai, who has been imprisoned in China since 2015. Formulations such as “media tyranny” could be found in the e-mails.”</i><br><br> In this case, the Chinese ambassador is using their official role (T0143.001: Authentic Persona, T0097.111: Government Official Persona) to try to influence the Swedish press. A government official trying to interfere in another country’s media activities could be a violation of press freedom. In this specific case, the Chinese diplomats are trying to silence criticism of China (T0139.002: Silence). |
| [T0131 Exploit TOS/Content Moderation](../../generated_pages/techniques/T0131.md) | <i>“Four media companies – Svenska Dagbladet, Expressen, Sveriges Radio, and Sveriges Television – stated that they had been contacted by the Chinese embassy on several occasions, and that they, for instance, had been criticized on their publications, both by letters and e-mails.<br><br> The media company Svenska Dagbladet, had been contacted on several occasions in the past two years, including via e-mails directly from the Chinese ambassador to Sweden. Several times, China and the Chinese ambassador had criticized the media company’s publications regarding the conditions in China. Individual reporters also reported having been subjected to criticism.<br><br> The tabloid Expressen had received several letters and e-mails from the embassy, e-mails containing criticism and threatening formulations regarding the coverage of the Swedish book publisher Gui Minhai, who has been imprisoned in China since 2015. Formulations such as “media tyranny” could be found in the e-mails.”</i><br><br> In this case, the Chinese ambassador is using their official role (T0143.001: Authentic Persona, T0097.111: Government Official Persona) to try to influence the Swedish press. A government official trying to interfere in another country’s media activities could be a violation of press freedom. In this specific case, the Chinese diplomats are trying to silence criticism of China (T0139.002: Silence). |
| [T0137 Make Money](../../generated_pages/techniques/T0137.md) | <i>“Four media companies – Svenska Dagbladet, Expressen, Sveriges Radio, and Sveriges Television – stated that they had been contacted by the Chinese embassy on several occasions, and that they, for instance, had been criticized on their publications, both by letters and e-mails.<br><br> The media company Svenska Dagbladet, had been contacted on several occasions in the past two years, including via e-mails directly from the Chinese ambassador to Sweden. Several times, China and the Chinese ambassador had criticized the media company’s publications regarding the conditions in China. Individual reporters also reported having been subjected to criticism.<br><br> The tabloid Expressen had received several letters and e-mails from the embassy, e-mails containing criticism and threatening formulations regarding the coverage of the Swedish book publisher Gui Minhai, who has been imprisoned in China since 2015. Formulations such as “media tyranny” could be found in the e-mails.”</i><br><br> In this case, the Chinese ambassador is using their official role (T0143.001: Authentic Persona, T0097.111: Government Official Persona) to try to influence the Swedish press. A government official trying to interfere in other countries’ media activities could be a violation of press freedom. In this specific case, the Chinese diplomats are trying to silence criticism of China (T0139.002: Silence). |
| [T0139.002 Silence](../../generated_pages/techniques/T0139.002.md) | <i>“Four media companies – Svenska Dagbladet, Expressen, Sveriges Radio, and Sveriges Television – stated that they had been contacted by the Chinese embassy on several occasions, and that they, for instance, had been criticized on their publications, both by letters and e-mails.<br><br> The media company Svenska Dagbladet, had been contacted on several occasions in the past two years, including via e-mails directly from the Chinese ambassador to Sweden. Several times, China and the Chinese ambassador had criticized the media company’s publications regarding the conditions in China. Individual reporters also reported having been subjected to criticism.<br><br> The tabloid Expressen had received several letters and e-mails from the embassy, e-mails containing criticism and threatening formulations regarding the coverage of the Swedish book publisher Gui Minhai, who has been imprisoned in China since 2015. Formulations such as “media tyranny” could be found in the e-mails.”</i><br><br> In this case, the Chinese ambassador is using their official role (T0143.001: Authentic Persona, T0097.111: Government Official Persona) to try to influence the Swedish press. A government official trying to interfere in other countries’ media activities could be a violation of press freedom. In this specific case, the Chinese diplomats are trying to silence criticism of China (T0139.002: Silence). |
| [T0143.001 Authentic Persona](../../generated_pages/techniques/T0143.001.md) | <i>“Four media companies – Svenska Dagbladet, Expressen, Sveriges Radio, and Sveriges Television – stated that they had been contacted by the Chinese embassy on several occasions, and that they, for instance, had been criticized on their publications, both by letters and e-mails.<br><br> The media company Svenska Dagbladet, had been contacted on several occasions in the past two years, including via e-mails directly from the Chinese ambassador to Sweden. Several times, China and the Chinese ambassador had criticized the media company’s publications regarding the conditions in China. Individual reporters also reported having been subjected to criticism.<br><br> The tabloid Expressen had received several letters and e-mails from the embassy, e-mails containing criticism and threatening formulations regarding the coverage of the Swedish book publisher Gui Minhai, who has been imprisoned in China since 2015. Formulations such as “media tyranny” could be found in the e-mails.”</i><br><br> In this case, the Chinese ambassador is using their official role (T0143.001: Authentic Persona, T0097.111: Government Official Persona) to try to influence the Swedish press. A government official trying to interfere in other countries’ media activities could be a violation of press freedom. In this specific case, the Chinese diplomats are trying to silence criticism of China (T0139.002: Silence). |
| [T0143.003 Impersonated Persona](../../generated_pages/techniques/T0143.003.md) | <i>“Four media companies – Svenska Dagbladet, Expressen, Sveriges Radio, and Sveriges Television – stated that they had been contacted by the Chinese embassy on several occasions, and that they, for instance, had been criticized on their publications, both by letters and e-mails.<br><br> The media company Svenska Dagbladet, had been contacted on several occasions in the past two years, including via e-mails directly from the Chinese ambassador to Sweden. Several times, China and the Chinese ambassador had criticized the media company’s publications regarding the conditions in China. Individual reporters also reported having been subjected to criticism.<br><br> The tabloid Expressen had received several letters and e-mails from the embassy, e-mails containing criticism and threatening formulations regarding the coverage of the Swedish book publisher Gui Minhai, who has been imprisoned in China since 2015. Formulations such as “media tyranny” could be found in the e-mails.”</i><br><br> In this case, the Chinese ambassador is using their official role (T0143.001: Authentic Persona, T0097.111: Government Official Persona) to try to influence the Swedish press. A government official trying to interfere in other countries’ media activities could be a violation of press freedom. In this specific case, the Chinese diplomats are trying to silence criticism of China (T0139.002: Silence). |
| [T0145.001 Copy Account Imagery](../../generated_pages/techniques/T0145.001.md) | "<i>“Four media companies – Svenska Dagbladet, Expressen, Sveriges Radio, and Sveriges Television – stated that they had been contacted by the Chinese embassy on several occasions, and that they, for instance, had been criticized on their publications, both by letters and e-mails.<br><br> The media company Svenska Dagbladet, had been contacted on several occasions in the past two years, including via e-mails directly from the Chinese ambassador to Sweden. Several times, China and the Chinese ambassador had criticized the media company’s publications regarding the conditions in China. Individual reporters also reported having been subjected to criticism.<br><br> The tabloid Expressen had received several letters and e-mails from the embassy, e-mails containing criticism and threatening formulations regarding the coverage of the Swedish book publisher Gui Minhai, who has been imprisoned in China since 2015. Formulations such as “media tyranny” could be found in the e-mails.”</i><br><br> In this case, the Chinese ambassador is using their official role (T0143.001: Authentic Persona, T0097.111: Government Official Persona) to try to influence Swedish press. A government official trying to interfere in other countries' media activities could be a violation of press freedom. In this specific case, the Chinese diplomats are trying to silence criticism against China (T0139.002: Silence).”" |
|
||||
| [T0145.001 Copy Account Imagery](../../generated_pages/techniques/T0145.001.md) | "<i>“Four media companies – Svenska Dagbladet, Expressen, Sveriges Radio, and Sveriges Television – stated that they had been contacted by the Chinese embassy on several occasions, and that they, for instance, had been criticized on their publications, both by letters and e-mails.<br><br> The media company Svenska Dagbladet, had been contacted on several occasions in the past two years, including via e-mails directly from the Chinese ambassador to Sweden. Several times, China and the Chinese ambassador had criticized the media company’s publications regarding the conditions in China. Individual reporters also reported having been subjected to criticism.<br><br> The tabloid Expressen had received several letters and e-mails from the embassy, e-mails containing criticism and threatening formulations regarding the coverage of the Swedish book publisher Gui Minhai, who has been imprisoned in China since 2015. Formulations such as “media tyranny” could be found in the e-mails.”</i><br><br> In this case, the Chinese ambassador is using their official role (T0143.001: Authentic Persona, T0097.111: Government Official Persona) to try to influence Swedish press. A government official trying to interfere in other countries' media activities could be a violation of press freedom. In this specific case, the Chinese diplomats are trying to silence criticism against China (T0139.002: Silence).”" |
|
||||
| [T0145.001 Copy Account Imagery](../../generated_pages/techniques/T0145.001.md) | "<i>“Four media companies – Svenska Dagbladet, Expressen, Sveriges Radio, and Sveriges Television – stated that they had been contacted by the Chinese embassy on several occasions, and that they, for instance, had been criticized on their publications, both by letters and e-mails.<br><br> The media company Svenska Dagbladet, had been contacted on several occasions in the past two years, including via e-mails directly from the Chinese ambassador to Sweden. Several times, China and the Chinese ambassador had criticized the media company’s publications regarding the conditions in China. Individual reporters also reported having been subjected to criticism.<br><br> The tabloid Expressen had received several letters and e-mails from the embassy, e-mails containing criticism and threatening formulations regarding the coverage of the Swedish book publisher Gui Minhai, who has been imprisoned in China since 2015. Formulations such as “media tyranny” could be found in the e-mails.”</i><br><br> In this case, the Chinese ambassador is using their official role (T0143.001: Authentic Persona, T0097.111: Government Official Persona) to try to influence Swedish press. A government official trying to interfere in other countries' media activities could be a violation of press freedom. In this specific case, the Chinese diplomats are trying to silence criticism against China (T0139.002: Silence).”" |
|
||||
| [T0097.111 Government Official Persona](../../generated_pages/techniques/T0097.111.md) | IT00000340 <i>“Four media companies – Svenska Dagbladet, Expressen, Sveriges Radio, and Sveriges Television – stated that they had been contacted by the Chinese embassy on several occasions, and that they, for instance, had been criticized on their publications, both by letters and e-mails.<br><br> The media company Svenska Dagbladet, had been contacted on several occasions in the past two years, including via e-mails directly from the Chinese ambassador to Sweden. Several times, China and the Chinese ambassador had criticized the media company’s publications regarding the conditions in China. Individual reporters also reported having been subjected to criticism.<br><br> The tabloid Expressen had received several letters and e-mails from the embassy, e-mails containing criticism and threatening formulations regarding the coverage of the Swedish book publisher Gui Minhai, who has been imprisoned in China since 2015. Formulations such as “media tyranny” could be found in the e-mails.”</i><br><br> In this case, the Chinese ambassador is using their official role (T0143.001: Authentic Persona, T0097.111: Government Official Persona) to try to influence Swedish press. A government official trying to interfere in other countries' media activities could be a violation of press freedom. In this specific case, the Chinese diplomats are trying to silence criticism against China (T0139.002: Silence).” |
|
||||
| [T0139.002 Silence](../../generated_pages/techniques/T0139.002.md) | IT00000341 <i>“Four media companies – Svenska Dagbladet, Expressen, Sveriges Radio, and Sveriges Television – stated that they had been contacted by the Chinese embassy on several occasions, and that they, for instance, had been criticized on their publications, both by letters and e-mails.<br><br> The media company Svenska Dagbladet, had been contacted on several occasions in the past two years, including via e-mails directly from the Chinese ambassador to Sweden. Several times, China and the Chinese ambassador had criticized the media company’s publications regarding the conditions in China. Individual reporters also reported having been subjected to criticism.<br><br> The tabloid Expressen had received several letters and e-mails from the embassy, e-mails containing criticism and threatening formulations regarding the coverage of the Swedish book publisher Gui Minhai, who has been imprisoned in China since 2015. Formulations such as “media tyranny” could be found in the e-mails.”</i><br><br> In this case, the Chinese ambassador is using their official role (T0143.001: Authentic Persona, T0097.111: Government Official Persona) to try to influence Swedish press. A government official trying to interfere in other countries' media activities could be a violation of press freedom. In this specific case, the Chinese diplomats are trying to silence criticism against China (T0139.002: Silence).” |
|
||||
| [T0143.001 Authentic Persona](../../generated_pages/techniques/T0143.001.md) | IT00000339 <i>“Four media companies – Svenska Dagbladet, Expressen, Sveriges Radio, and Sveriges Television – stated that they had been contacted by the Chinese embassy on several occasions, and that they, for instance, had been criticized on their publications, both by letters and e-mails.<br><br> The media company Svenska Dagbladet, had been contacted on several occasions in the past two years, including via e-mails directly from the Chinese ambassador to Sweden. Several times, China and the Chinese ambassador had criticized the media company’s publications regarding the conditions in China. Individual reporters also reported having been subjected to criticism.<br><br> The tabloid Expressen had received several letters and e-mails from the embassy, e-mails containing criticism and threatening formulations regarding the coverage of the Swedish book publisher Gui Minhai, who has been imprisoned in China since 2015. Formulations such as “media tyranny” could be found in the e-mails.”</i><br><br> In this case, the Chinese ambassador is using their official role (T0143.001: Authentic Persona, T0097.111: Government Official Persona) to try to influence Swedish press. A government official trying to interfere in other countries' media activities could be a violation of press freedom. In this specific case, the Chinese diplomats are trying to silence criticism against China (T0139.002: Silence).” |
|
||||
|
||||
|
||||
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
|
@@ -23,11 +23,11 @@
|
||||
| --------- | ------------------------- |
|
||||
| [T0015 Create Hashtags and Search Artefacts](../../generated_pages/techniques/T0015.md) | IT00000302 In this report accounts were identified as part of “a sophisticated and extensive coordinated network orchestrating a disinformation campaign targeting Israeli digital spaces since October 7th, 2023”, which posted hashtags alongside campaign content (T0015: Create Hashtags and Search Artefacts):<br><br><i>“The accounts post generic images to fill their account feed to make the account seem real. They then employ a hidden hashtag in their posts, consisting of a seemingly random string of numbers and letters.<br><br>“The hypothesis regarding this tactic is that the group orchestrating these accounts utilizes these hashtags as a means of indexing them. This system likely serves a dual purpose: firstly, to keep track of the network’s expansive network of accounts and unique posts, and secondly, to streamline the process of boosting engagement among these accounts. By searching for these specific, unique hashtags, the group can quickly locate posts from their network and engage with them using other fake accounts, thereby artificially inflating the visibility and perceived authenticity of the fake account.”</i> |
|
||||
| [T0085.008 Machine Translated Text](../../generated_pages/techniques/T0085.008.md) | IT00000301 In this report accounts were identified as part of “a sophisticated and extensive coordinated network orchestrating a disinformation campaign targeting Israeli digital spaces since October 7th, 2023”.<br><br><i>“A conspicuous aspect of these accounts is the likely usage of machine-translated Hebrew. The disjointed and linguistically strange comments imply that the CIB’s architects are not Hebrew-speaking and likely translate to Hebrew using online tools. There’s no official way to confirm that a text is translated, but it is evident when the gender for nouns is incorrect, very unusual words or illogical grammar being used usually lead to the conclusion that the comment was not written by a native speaker that is aware of the nuances of the language.”</i><br><br>In this example analysts asserted that accounts were posting content which had been translated via machine (T0085.008: Machine Translated Text), based on indicators such as issues with grammar and gender. |
|
||||
| [T0097.101 Local Persona](../../generated_pages/techniques/T0097.101.md) | IT00000296 Accounts which were identified as part of “a sophisticated and extensive coordinated network orchestrating a disinformation campaign targeting Israeli digital spaces since October 7th, 2023” were presenting themselves as locals to Israel (T0097.101: Local Persona):<br><br><i>“Unlike usual low-effort fake accounts, these accounts meticulously mimic young Israelis. They stand out due to the extraordinary lengths taken to ensure their authenticity, from unique narratives to the content they produce to their seemingly authentic interactions.”</i> |
|
||||
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000297 In this report accounts were identified as part of “a sophisticated and extensive coordinated network orchestrating a disinformation campaign targeting Israeli digital spaces since October 7th, 2023”.<br><br> <i>“A core component of the detection methodology was applying qualitative linguistic analysis. This involved checking the fingerprint of language, syntax, and style used in the comments and profile of the suspected account. Each account bio consistently incorporated a combination of specific elements: emojis, nationality, location, educational institution or occupation, age, and a personal quote, sports team or band. The recurrence of this specific formula across multiple accounts hinted at a standardized template for bio construction.”</i><br><br> This example shows how actors can follow a templated formula to present a persona on social media platforms (T0143.002: Fabricated Persona, T0144.002: Persona Template). |
|
||||
| [T0144.002 Persona Template](../../generated_pages/techniques/T0144.002.md) | IT00000298 In this report accounts were identified as part of “a sophisticated and extensive coordinated network orchestrating a disinformation campaign targeting Israeli digital spaces since October 7th, 2023”.<br><br> <i>“A core component of the detection methodology was applying qualitative linguistic analysis. This involved checking the fingerprint of language, syntax, and style used in the comments and profile of the suspected account. Each account bio consistently incorporated a combination of specific elements: emojis, nationality, location, educational institution or occupation, age, and a personal quote, sports team or band. The recurrence of this specific formula across multiple accounts hinted at a standardized template for bio construction.”</i><br><br> This example shows how actors can follow a templated formula to present a persona on social media platforms (T0143.002: Fabricated Persona, T0144.002: Persona Template). |
|
||||
| [T0145.001 Copy Account Imagery](../../generated_pages/techniques/T0145.001.md) | IT00000300 <i>“In the wake of the Hamas attack on October 7th, the Israel Defense Forces (IDF) Information Security Department revealed a campaign of Instagram accounts impersonating young, attractive Israeli women who were actively engaging Israeli soldiers, attempting to extract information through direct messages.<br><br> [...]<br><br> “Some profiles underwent a reverse-image search of their photos to ascertain their authenticity. Many of the images searched were found to be appropriated from genuine social media profiles or sites such as Pinterest. When this was the case, the account was marked as confirmed to be inauthentic. One innovative method involves using photos that are initially frames from videos, which allows for evading reverse searches in most cases. This is seen in Figure 4, where an image uploaded by an inauthentic account was a screenshot taken from a TikTok video.”</i><br><br> In this example accounts associated with an influence operation used account imagery showing <i>“young, attractive Israeli women”</i> (T0145.006: Attractive Person Account Imagery), with some of these assets taken from existing accounts not associated with the operation (T0145.001: Copy Account Imagery). |
|
||||
| [T0145.006 Attractive Person Account Imagery](../../generated_pages/techniques/T0145.006.md) | IT00000299 <i>“In the wake of the Hamas attack on October 7th, the Israel Defense Forces (IDF) Information Security Department revealed a campaign of Instagram accounts impersonating young, attractive Israeli women who were actively engaging Israeli soldiers, attempting to extract information through direct messages.<br><br> [...]<br><br> “Some profiles underwent a reverse-image search of their photos to ascertain their authenticity. Many of the images searched were found to be appropriated from genuine social media profiles or sites such as Pinterest. When this was the case, the account was marked as confirmed to be inauthentic. One innovative method involves using photos that are initially frames from videos, which allows for evading reverse searches in most cases. This is seen in Figure 4, where an image uploaded by an inauthentic account was a screenshot taken from a TikTok video.”</i><br><br> In this example accounts associated with an influence operation used account imagery showing <i>“young, attractive Israeli women”</i> (T0145.006: Attractive Person Account Imagery), with some of these assets taken from existing accounts not associated with the operation (T0145.001: Copy Account Imagery). |
|
||||
|
||||
|
||||
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
|
@@ -23,9 +23,8 @@
|
||||
| --------- | ------------------------- |
|
||||
| [T0097.101 Local Persona](../../generated_pages/techniques/T0097.101.md) | IT00000307 <i>“Another actor operating in China is the American-based company Devumi. Most of the Twitter accounts managed by Devumi resemble real people, and some are even associated with a kind of large-scale social identity theft. At least 55,000 of the accounts use the names, profile pictures, hometowns and other personal details of real Twitter users, including minors, according to The New York Times (Confessore et al., 2018)).”</i><br><br>In this example accounts impersonated real locals while spreading operation narratives (T0143.003: Impersonated Persona, T0097.101: Local Persona). The impersonation included stealing the legitimate accounts’ profile pictures (T0145.001: Copy Account Imagery). |
|
||||
| [T0143.003 Impersonated Persona](../../generated_pages/techniques/T0143.003.md) | IT00000306 <i>“Another actor operating in China is the American-based company Devumi. Most of the Twitter accounts managed by Devumi resemble real people, and some are even associated with a kind of large-scale social identity theft. At least 55,000 of the accounts use the names, profile pictures, hometowns and other personal details of real Twitter users, including minors, according to The New York Times (Confessore et al., 2018)).”</i><br><br>In this example accounts impersonated real locals while spreading operation narratives (T0143.003: Impersonated Persona, T0097.101: Local Persona). The impersonation included stealing the legitimate accounts’ profile pictures (T0145.001: Copy Account Imagery). |
|
||||
| [T0145.001 Copy Account Imagery](../../generated_pages/techniques/T0145.001.md) | IT00000305 <i>“Another actor operating in China is the American-based company Devumi. Most of the Twitter accounts managed by Devumi resemble real people, and some are even associated with a kind of large-scale social identity theft. At least 55,000 of the accounts use the names, profile pictures, hometowns and other personal details of real Twitter users, including minors, according to The New York Times (Confessore et al., 2018)).”</i><br><br>In this example accounts impersonated real locals while spreading operation narratives (T0143.003: Impersonated Persona, T0097.101: Local Persona). The impersonation included stealing the legitimate accounts’ profile pictures (T0145.001: Copy Account Imagery). |
|
||||
| [T0145.001 Copy Account Imagery](../../generated_pages/techniques/T0145.001.md) | IT00000303 <i>“In 2017, Tanya O'Carroll, a technology and human rights adviser for Amnesty International, published an investigation of the political impact of bots and trolls in Mexico (O’Carroll, 2017). An article by the BBC describes a video showing the operation of a "troll farm" in Mexico, where people were tweeting in support of Enrique Peña Nieto of the PRI in 2012 (Martinez, 2018).<br><br>“According to a report published by El País, the main target of parties’ online strategies are young people, including 14 million new voters who are expected to play a decisive role in the outcome of the July 2018 election (Peinado et al., 2018). Thus, one of the strategies employed by these bots was the use of profile photos of attractive people from other countries (Soloff, 2017).”</i><br><br>In this example accounts copied the profile pictures of attractive people from other countries (T0145.001: Copy Account Imagery, T0145.006: Attractive Person Account Imagery). |
|
||||
| [T0145.006 Attractive Person Account Imagery](../../generated_pages/techniques/T0145.006.md) | IT00000304 <i>“In 2017, Tanya O'Carroll, a technology and human rights adviser for Amnesty International, published an investigation of the political impact of bots and trolls in Mexico (O’Carroll, 2017). An article by the BBC describes a video showing the operation of a "troll farm" in Mexico, where people were tweeting in support of Enrique Peña Nieto of the PRI in 2012 (Martinez, 2018).<br><br>“According to a report published by El País, the main target of parties’ online strategies are young people, including 14 million new voters who are expected to play a decisive role in the outcome of the July 2018 election (Peinado et al., 2018). Thus, one of the strategies employed by these bots was the use of profile photos of attractive people from other countries (Soloff, 2017).”</i><br><br>In this example accounts copied the profile pictures of attractive people from other countries (T0145.001: Copy Account Imagery, T0145.006: Attractive Person Account Imagery). |
|
||||
|
||||
|
||||
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
|
@@ -21,10 +21,10 @@
|
||||
|
||||
| Technique | Description given for this incident |
|
||||
| --------- | ------------------------- |
|
||||
| [T0085.008 Machine Translated Text](../../generated_pages/techniques/T0085.008.md) | IT00000310 <i>“The broader War of Somethings (WoS) network, so dubbed because all the Facebook pages and user accounts in the network are connected to “The War of Somethings” page, behaves very similarly to previous Spamouflage campaigns. [Spamouflage is a coordinated inauthentic behaviour network attributed to the Chinese state.]<br><br> “Like other components of Spamouflage, the WoS network sometimes intersperses apolitical content with its more agenda-driven material. Many members post nearly identical comments at almost the same time. The text includes markers of automatic translation while error messages included as profile photos indicate the automated pulling of stock images.”</i><br><br> In this example analysts found an indicator of automated use of stock images in Facebook accounts; some of the accounts in the network appeared to have mistakenly uploaded error messages as profile pictures (T0145.007: Stock Image Account Imagery). The text posted by the accounts also appeared to have been translated using automation (T0085.008: Machine Translated Text). |
|
||||
| [T0145.002 AI-Generated Account Imagery](../../generated_pages/techniques/T0145.002.md) | IT00000309 <i>“The broader War of Somethings (WoS) network, so dubbed because all the Facebook pages and user accounts in the network are connected to “The War of Somethings” page, behaves very similarly to previous Spamouflage campaigns.<br><br> “Spamouflage is a coordinated inauthentic behaviour network attributed to the Chinese state.<br><br> “Despite the WoS network’s relative sophistication, there are tell-tale signs that it is an influence operation. Several user profile photos display signs of AI generation or do not match the profile’s listed gender.”</i><br><br> A network of accounts connected to the Facebook page “The War of Somethings” used AI-generated images of people as their profile picture (T0145.002: AI-Generated Account Imagery). |
|
||||
| [T0145.003 Animal Account Imagery](../../generated_pages/techniques/T0145.003.md) | IT00000308 <i>“Beneath a video on Facebook about the war between Israel and Hamas, Lamonica Trout commented, “America is the war monger, the Jew’s own son!” She left identical comments beneath the same video on two other Facebook pages. Trout’s profile provides no information besides her name. It lists no friends, and there is not a single post or photograph in her feed. Trout’s profile photo shows an alligator.<br><br> “Lamonica Trout is likely an invention of the group behind Spamouflage, an ongoing, multi-year influence operation that promotes Beijing’s interests. Last year, Facebook’s parent company, Meta, took down 7,704 accounts and 954 pages it identified as part of the Spamouflage operation, which it described as the “largest known cross-platform influence operation [Meta had] disrupted to date.”2 Facebook’s terms of service prohibit a range of deceptive and inauthentic behaviors, including efforts to conceal the purpose of social media activity or the identity of those behind it.”</i><br><br> In this example an account attributed to a multi-year influence operation created the persona of Lamonica Trout in a Facebook account, which used an image of an animal in its profile picture (T0145.003: Animal Account Imagery). |
| [T0145.007 Stock Image Account Imagery](../../generated_pages/techniques/T0145.007.md) | IT00000311 <i>“The broader War of Somethings (WoS) network, so dubbed because all the Facebook pages and user accounts in the network are connected to “The War of Somethings” page, behaves very similarly to previous Spamouflage campaigns. [Spamouflage is a coordinated inauthentic behaviour network attributed to the Chinese state.]<br><br> “Like other components of Spamouflage, the WoS network sometimes intersperses apolitical content with its more agenda-driven material. Many members post nearly identical comments at almost the same time. The text includes markers of automatic translation while error messages included as profile photos indicate the automated pulling of stock images.”</i><br><br> In this example analysts found an indicator of automated use of stock images in Facebook accounts; some of the accounts in the network appeared to have mistakenly uploaded error messages as profile pictures (T0145.007: Stock Image Account Imagery). The text posted by the accounts also appeared to have been translated using automation (T0085.008: Machine Translated Text). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -21,8 +21,8 @@

| Technique | Description given for this incident |
| --------- | ------------------------- |
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000314 <i>“On Facebook, Rita, Alona and Christina appeared to be just like the millions of other U.S citizens sharing their lives with the world. They discussed family outings, shared emojis and commented on each other's photographs.<br><br> “In reality, the three accounts were part of a highly-targeted cybercrime operation, used to spread malware that was able to steal passwords and spy on victims.<br><br> “Hackers with links to Lebanon likely ran the covert scheme using a strain of malware dubbed "Tempting Cedar Spyware," according to researchers from Prague-based anti-virus company Avast, which detailed its findings in a report released on Wednesday.<br><br> “In a honey trap tactic as old as time, the culprits' targets were mostly male, and lured by fake attractive women. <br><br> “In the attack, hackers would send flirtatious messages using Facebook to the chosen victims, encouraging them to download a second, booby-trapped, chat application known as Kik Messenger to have "more secure" conversations. Upon analysis, Avast experts found that "many fell for the trap.””</i><br><br> In this example threat actors took on the persona of a romantic suitor on Facebook, directing their targets to another platform (T0097.109: Romantic Suitor Persona, T0145.006: Attractive Person Account Imagery, T0143.002: Fabricated Persona). |
| [T0145.006 Attractive Person Account Imagery](../../generated_pages/techniques/T0145.006.md) | IT00000312 <i>“On Facebook, Rita, Alona and Christina appeared to be just like the millions of other U.S citizens sharing their lives with the world. They discussed family outings, shared emojis and commented on each other's photographs.<br><br> “In reality, the three accounts were part of a highly-targeted cybercrime operation, used to spread malware that was able to steal passwords and spy on victims.<br><br> “Hackers with links to Lebanon likely ran the covert scheme using a strain of malware dubbed "Tempting Cedar Spyware," according to researchers from Prague-based anti-virus company Avast, which detailed its findings in a report released on Wednesday.<br><br> “In a honey trap tactic as old as time, the culprits' targets were mostly male, and lured by fake attractive women. <br><br> “In the attack, hackers would send flirtatious messages using Facebook to the chosen victims, encouraging them to download a second, booby-trapped, chat application known as Kik Messenger to have "more secure" conversations. Upon analysis, Avast experts found that "many fell for the trap.””</i><br><br> In this example threat actors took on the persona of a romantic suitor on Facebook, directing their targets to another platform (T0097.109: Romantic Suitor Persona, T0145.006: Attractive Person Account Imagery, T0143.002: Fabricated Persona). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -21,66 +21,9 @@

| Technique | Description given for this incident |
| --------- | ------------------------- |
| [T0084.002 Plagiarise Content](../../generated_pages/techniques/T0084.002.md) | <i>“On October 23, Canada’s Foreign Ministry said it had discovered a disinformation campaign, likely tied to China, aimed at discrediting dozens of Canadian politicians, including Prime Minister Justin Trudeau.<br><br> “The ministry said the campaign took place in August and September. It used new and hijacked social media accounts to bulk-post messages targeting Canadian politicians (T0141.001: Acquire Compromised Account).<br><br> “A Chinese Embassy in Canada spokesperson dismissed Canada’s accusation as baseless.<br><br> ““Canada was a downright liar and disseminator of false information… Beijing has never meddled in another nation’s domestic affairs.”<br><br> “A Chinese Embassy in Canada spokesperson dismissed Canada’s accusation as baseless.<br><br> “That is false.<br><br> “The Canadian government's report is based on an investigation conducted by its Rapid Response Mechanism cyber intelligence unit in cooperation with the social media platforms.<br><br> “The investigation exposed China’s disinformation campaign dubbed “Spamouflage” -- for its tactic of using “a network of new or hijacked social media accounts that posts and increases the number of propaganda messages across multiple social media platforms – including Facebook, X/Twitter, Instagram, YouTube, Medium, Reddit, TikTok, and LinkedIn.””</i><br><br> In this case a network of accounts attributed to China were identified operating on multiple platforms. The report was dismissed as false information by an official in the Chinese Embassy in Canada (T0143.001: Authentic Persona, T0097.111: Government Official Persona, T0129.006: Deny Involvement). |
| [T0085.004 Develop Document](../../generated_pages/techniques/T0085.004.md) | <i>“On October 23, Canada’s Foreign Ministry said it had discovered a disinformation campaign, likely tied to China, aimed at discrediting dozens of Canadian politicians, including Prime Minister Justin Trudeau.<br><br> “The ministry said the campaign took place in August and September. It used new and hijacked social media accounts to bulk-post messages targeting Canadian politicians (T0141.001: Acquire Compromised Account).<br><br> “A Chinese Embassy in Canada spokesperson dismissed Canada’s accusation as baseless.<br><br> ““Canada was a downright liar and disseminator of false information… Beijing has never meddled in another nation’s domestic affairs.”<br><br> “A Chinese Embassy in Canada spokesperson dismissed Canada’s accusation as baseless.<br><br> “That is false.<br><br> “The Canadian government's report is based on an investigation conducted by its Rapid Response Mechanism cyber intelligence unit in cooperation with the social media platforms.<br><br> “The investigation exposed China’s disinformation campaign dubbed “Spamouflage” -- for its tactic of using “a network of new or hijacked social media accounts that posts and increases the number of propaganda messages across multiple social media platforms – including Facebook, X/Twitter, Instagram, YouTube, Medium, Reddit, TikTok, and LinkedIn.””</i><br><br> In this case a network of accounts attributed to China were identified operating on multiple platforms. The report was dismissed as false information by an official in the Chinese Embassy in Canada (T0143.001: Authentic Persona, T0097.111: Government Official Persona, T0129.006: Deny Involvement). |
| [T0097.110 Party Official Persona](../../generated_pages/techniques/T0097.110.md) | <i>“On October 23, Canada’s Foreign Ministry said it had discovered a disinformation campaign, likely tied to China, aimed at discrediting dozens of Canadian politicians, including Prime Minister Justin Trudeau.<br><br> “The ministry said the campaign took place in August and September. It used new and hijacked social media accounts to bulk-post messages targeting Canadian politicians (T0141.001: Acquire Compromised Account).<br><br> “A Chinese Embassy in Canada spokesperson dismissed Canada’s accusation as baseless.<br><br> ““Canada was a downright liar and disseminator of false information… Beijing has never meddled in another nation’s domestic affairs.”<br><br> “A Chinese Embassy in Canada spokesperson dismissed Canada’s accusation as baseless.<br><br> “That is false.<br><br> “The Canadian government's report is based on an investigation conducted by its Rapid Response Mechanism cyber intelligence unit in cooperation with the social media platforms.<br><br> “The investigation exposed China’s disinformation campaign dubbed “Spamouflage” -- for its tactic of using “a network of new or hijacked social media accounts that posts and increases the number of propaganda messages across multiple social media platforms – including Facebook, X/Twitter, Instagram, YouTube, Medium, Reddit, TikTok, and LinkedIn.””</i><br><br> In this case a network of accounts attributed to China were identified operating on multiple platforms. The report was dismissed as false information by an official in the Chinese Embassy in Canada (T0143.001: Authentic Persona, T0097.111: Government Official Persona, T0129.006: Deny Involvement). |
| [T0097.111 Government Official Persona](../../generated_pages/techniques/T0097.111.md) | <i>“On October 23, Canada’s Foreign Ministry said it had discovered a disinformation campaign, likely tied to China, aimed at discrediting dozens of Canadian politicians, including Prime Minister Justin Trudeau.<br><br> “The ministry said the campaign took place in August and September. It used new and hijacked social media accounts to bulk-post messages targeting Canadian politicians (T0141.001: Acquire Compromised Account).<br><br> “A Chinese Embassy in Canada spokesperson dismissed Canada’s accusation as baseless.<br><br> ““Canada was a downright liar and disseminator of false information… Beijing has never meddled in another nation’s domestic affairs.”<br><br> “A Chinese Embassy in Canada spokesperson dismissed Canada’s accusation as baseless.<br><br> “That is false.<br><br> “The Canadian government's report is based on an investigation conducted by its Rapid Response Mechanism cyber intelligence unit in cooperation with the social media platforms.<br><br> “The investigation exposed China’s disinformation campaign dubbed “Spamouflage” -- for its tactic of using “a network of new or hijacked social media accounts that posts and increases the number of propaganda messages across multiple social media platforms – including Facebook, X/Twitter, Instagram, YouTube, Medium, Reddit, TikTok, and LinkedIn.””</i><br><br> In this case a network of accounts attributed to China were identified operating on multiple platforms. The report was dismissed as false information by an official in the Chinese Embassy in Canada (T0143.001: Authentic Persona, T0097.111: Government Official Persona, T0129.006: Deny Involvement). |
| [T0097.206 Government Institution Persona](../../generated_pages/techniques/T0097.206.md) | <i>“On October 23, Canada’s Foreign Ministry said it had discovered a disinformation campaign, likely tied to China, aimed at discrediting dozens of Canadian politicians, including Prime Minister Justin Trudeau.<br><br> “The ministry said the campaign took place in August and September. It used new and hijacked social media accounts to bulk-post messages targeting Canadian politicians (T0141.001: Acquire Compromised Account).<br><br> “A Chinese Embassy in Canada spokesperson dismissed Canada’s accusation as baseless.<br><br> ““Canada was a downright liar and disseminator of false information… Beijing has never meddled in another nation’s domestic affairs.”<br><br> “That is false.<br><br> “The Canadian government's report is based on an investigation conducted by its Rapid Response Mechanism cyber intelligence unit in cooperation with the social media platforms.<br><br> “The investigation exposed China’s disinformation campaign dubbed “Spamouflage” -- for its tactic of using “a network of new or hijacked social media accounts that posts and increases the number of propaganda messages across multiple social media platforms – including Facebook, X/Twitter, Instagram, YouTube, Medium, Reddit, TikTok, and LinkedIn.””</i><br><br> In this case a network of accounts attributed to China was identified operating on multiple platforms. The report was dismissed as false information by an official in the Chinese Embassy in Canada (T0143.001: Authentic Persona, T0097.111: Government Official Persona, T0129.006: Deny Involvement). |
| [T0129.006 Deny Involvement](../../generated_pages/techniques/T0129.006.md) | <i>“On October 23, Canada’s Foreign Ministry said it had discovered a disinformation campaign, likely tied to China, aimed at discrediting dozens of Canadian politicians, including Prime Minister Justin Trudeau.<br><br> “The ministry said the campaign took place in August and September. It used new and hijacked social media accounts to bulk-post messages targeting Canadian politicians (T0141.001: Acquire Compromised Account).<br><br> “A Chinese Embassy in Canada spokesperson dismissed Canada’s accusation as baseless.<br><br> ““Canada was a downright liar and disseminator of false information… Beijing has never meddled in another nation’s domestic affairs.”<br><br> “That is false.<br><br> “The Canadian government's report is based on an investigation conducted by its Rapid Response Mechanism cyber intelligence unit in cooperation with the social media platforms.<br><br> “The investigation exposed China’s disinformation campaign dubbed “Spamouflage” -- for its tactic of using “a network of new or hijacked social media accounts that posts and increases the number of propaganda messages across multiple social media platforms – including Facebook, X/Twitter, Instagram, YouTube, Medium, Reddit, TikTok, and LinkedIn.””</i><br><br> In this case a network of accounts attributed to China was identified operating on multiple platforms. The report was dismissed as false information by an official in the Chinese Embassy in Canada (T0143.001: Authentic Persona, T0097.111: Government Official Persona, T0129.006: Deny Involvement). |
| [T0131 Exploit TOS/Content Moderation](../../generated_pages/techniques/T0131.md) | <i>“On October 23, Canada’s Foreign Ministry said it had discovered a disinformation campaign, likely tied to China, aimed at discrediting dozens of Canadian politicians, including Prime Minister Justin Trudeau.<br><br> “The ministry said the campaign took place in August and September. It used new and hijacked social media accounts to bulk-post messages targeting Canadian politicians (T0141.001: Acquire Compromised Account).<br><br> “A Chinese Embassy in Canada spokesperson dismissed Canada’s accusation as baseless.<br><br> ““Canada was a downright liar and disseminator of false information… Beijing has never meddled in another nation’s domestic affairs.”<br><br> “That is false.<br><br> “The Canadian government’s report is based on an investigation conducted by its Rapid Response Mechanism cyber intelligence unit in cooperation with the social media platforms.<br><br> “The investigation exposed China’s disinformation campaign dubbed “Spamouflage” – for its tactic of using “a network of new or hijacked social media accounts that posts and increases the number of propaganda messages across multiple social media platforms – including Facebook, X/Twitter, Instagram, YouTube, Medium, Reddit, TikTok, and LinkedIn.””</i><br><br> In this case a network of accounts attributed to China was identified operating on multiple platforms. The report was dismissed as false information by an official in the Chinese Embassy in Canada (T0143.001: Authentic Persona, T0097.111: Government Official Persona, T0129.006: Deny Involvement). |
| [T0137 Make Money](../../generated_pages/techniques/T0137.md) | <i>“On October 23, Canada’s Foreign Ministry said it had discovered a disinformation campaign, likely tied to China, aimed at discrediting dozens of Canadian politicians, including Prime Minister Justin Trudeau.<br><br> “The ministry said the campaign took place in August and September. It used new and hijacked social media accounts to bulk-post messages targeting Canadian politicians (T0141.001: Acquire Compromised Account).<br><br> “A Chinese Embassy in Canada spokesperson dismissed Canada’s accusation as baseless.<br><br> ““Canada was a downright liar and disseminator of false information… Beijing has never meddled in another nation’s domestic affairs.”<br><br> “That is false.<br><br> “The Canadian government’s report is based on an investigation conducted by its Rapid Response Mechanism cyber intelligence unit in cooperation with the social media platforms.<br><br> “The investigation exposed China’s disinformation campaign dubbed “Spamouflage” – for its tactic of using “a network of new or hijacked social media accounts that posts and increases the number of propaganda messages across multiple social media platforms – including Facebook, X/Twitter, Instagram, YouTube, Medium, Reddit, TikTok, and LinkedIn.””</i><br><br> In this case a network of accounts attributed to China was identified operating on multiple platforms. The report was dismissed as false information by an official in the Chinese Embassy in Canada (T0143.001: Authentic Persona, T0097.111: Government Official Persona, T0129.006: Deny Involvement). |
| [T0139.002 Silence](../../generated_pages/techniques/T0139.002.md) | <i>“On October 23, Canada’s Foreign Ministry said it had discovered a disinformation campaign, likely tied to China, aimed at discrediting dozens of Canadian politicians, including Prime Minister Justin Trudeau.<br><br> “The ministry said the campaign took place in August and September. It used new and hijacked social media accounts to bulk-post messages targeting Canadian politicians (T0141.001: Acquire Compromised Account).<br><br> “A Chinese Embassy in Canada spokesperson dismissed Canada’s accusation as baseless.<br><br> ““Canada was a downright liar and disseminator of false information… Beijing has never meddled in another nation’s domestic affairs.”<br><br> “That is false.<br><br> “The Canadian government’s report is based on an investigation conducted by its Rapid Response Mechanism cyber intelligence unit in cooperation with the social media platforms.<br><br> “The investigation exposed China’s disinformation campaign dubbed “Spamouflage” – for its tactic of using “a network of new or hijacked social media accounts that posts and increases the number of propaganda messages across multiple social media platforms – including Facebook, X/Twitter, Instagram, YouTube, Medium, Reddit, TikTok, and LinkedIn.””</i><br><br> In this case a network of accounts attributed to China was identified operating on multiple platforms. The report was dismissed as false information by an official in the Chinese Embassy in Canada (T0143.001: Authentic Persona, T0097.111: Government Official Persona, T0129.006: Deny Involvement). |
| [T0143.001 Authentic Persona](../../generated_pages/techniques/T0143.001.md) | <i>“On October 23, Canada’s Foreign Ministry said it had discovered a disinformation campaign, likely tied to China, aimed at discrediting dozens of Canadian politicians, including Prime Minister Justin Trudeau.<br><br> “The ministry said the campaign took place in August and September. It used new and hijacked social media accounts to bulk-post messages targeting Canadian politicians (T0141.001: Acquire Compromised Account).<br><br> “A Chinese Embassy in Canada spokesperson dismissed Canada’s accusation as baseless.<br><br> ““Canada was a downright liar and disseminator of false information… Beijing has never meddled in another nation’s domestic affairs.”<br><br> “That is false.<br><br> “The Canadian government’s report is based on an investigation conducted by its Rapid Response Mechanism cyber intelligence unit in cooperation with the social media platforms.<br><br> “The investigation exposed China’s disinformation campaign dubbed “Spamouflage” – for its tactic of using “a network of new or hijacked social media accounts that posts and increases the number of propaganda messages across multiple social media platforms – including Facebook, X/Twitter, Instagram, YouTube, Medium, Reddit, TikTok, and LinkedIn.””</i><br><br> In this case a network of accounts attributed to China was identified operating on multiple platforms. The report was dismissed as false information by an official in the Chinese Embassy in Canada (T0143.001: Authentic Persona, T0097.111: Government Official Persona, T0129.006: Deny Involvement). |
| [T0143.003 Impersonated Persona](../../generated_pages/techniques/T0143.003.md) | <i>“On October 23, Canada’s Foreign Ministry said it had discovered a disinformation campaign, likely tied to China, aimed at discrediting dozens of Canadian politicians, including Prime Minister Justin Trudeau.<br><br> “The ministry said the campaign took place in August and September. It used new and hijacked social media accounts to bulk-post messages targeting Canadian politicians (T0141.001: Acquire Compromised Account).<br><br> “A Chinese Embassy in Canada spokesperson dismissed Canada’s accusation as baseless.<br><br> ““Canada was a downright liar and disseminator of false information… Beijing has never meddled in another nation’s domestic affairs.”<br><br> “That is false.<br><br> “The Canadian government’s report is based on an investigation conducted by its Rapid Response Mechanism cyber intelligence unit in cooperation with the social media platforms.<br><br> “The investigation exposed China’s disinformation campaign dubbed “Spamouflage” – for its tactic of using “a network of new or hijacked social media accounts that posts and increases the number of propaganda messages across multiple social media platforms – including Facebook, X/Twitter, Instagram, YouTube, Medium, Reddit, TikTok, and LinkedIn.””</i><br><br> In this case a network of accounts attributed to China was identified operating on multiple platforms. The report was dismissed as false information by an official in the Chinese Embassy in Canada (T0143.001: Authentic Persona, T0097.111: Government Official Persona, T0129.006: Deny Involvement). |
| [T0145.001 Copy Account Imagery](../../generated_pages/techniques/T0145.001.md) | <i>“On October 23, Canada’s Foreign Ministry said it had discovered a disinformation campaign, likely tied to China, aimed at discrediting dozens of Canadian politicians, including Prime Minister Justin Trudeau.<br><br> “The ministry said the campaign took place in August and September. It used new and hijacked social media accounts to bulk-post messages targeting Canadian politicians (T0141.001: Acquire Compromised Account).<br><br> “A Chinese Embassy in Canada spokesperson dismissed Canada’s accusation as baseless.<br><br> ““Canada was a downright liar and disseminator of false information… Beijing has never meddled in another nation’s domestic affairs.”<br><br> “A Chinese Embassy in Canada spokesperson dismissed Canada’s accusation as baseless.<br><br> “That is false.<br><br> “The Canadian government's report is based on an investigation conducted by its Rapid Response Mechanism cyber intelligence unit in cooperation with the social media platforms.<br><br> “The investigation exposed China’s disinformation campaign dubbed “Spamouflage” -- for its tactic of using “a network of new or hijacked social media accounts that posts and increases the number of propaganda messages across multiple social media platforms – including Facebook, X/Twitter, Instagram, YouTube, Medium, Reddit, TikTok, and LinkedIn.””</i><br><br> In this case a network of accounts attributed to China were identified operating on multiple platforms. The report was dismissed as false information by an official in the Chinese Embassy in Canada (T0143.001: Authentic Persona, T0097.111: Government Official Persona, T0129.006: Deny Involvement). |
| [T0097.111 Government Official Persona](../../generated_pages/techniques/T0097.111.md) | IT00000343 <i>“On October 23, Canada’s Foreign Ministry said it had discovered a disinformation campaign, likely tied to China, aimed at discrediting dozens of Canadian politicians, including Prime Minister Justin Trudeau.<br><br> “The ministry said the campaign took place in August and September. It used new and hijacked social media accounts to bulk-post messages targeting Canadian politicians (T0141.001: Acquire Compromised Account).<br><br> “A Chinese Embassy in Canada spokesperson dismissed Canada’s accusation as baseless.<br><br> ““Canada was a downright liar and disseminator of false information… Beijing has never meddled in another nation’s domestic affairs.”<br><br> “A Chinese Embassy in Canada spokesperson dismissed Canada’s accusation as baseless.<br><br> “That is false.<br><br> “The Canadian government's report is based on an investigation conducted by its Rapid Response Mechanism cyber intelligence unit in cooperation with the social media platforms.<br><br> “The investigation exposed China’s disinformation campaign dubbed “Spamouflage” -- for its tactic of using “a network of new or hijacked social media accounts that posts and increases the number of propaganda messages across multiple social media platforms – including Facebook, X/Twitter, Instagram, YouTube, Medium, Reddit, TikTok, and LinkedIn.””</i><br><br> In this case a network of accounts attributed to China were identified operating on multiple platforms. The report was dismissed as false information by an official in the Chinese Embassy in Canada (T0143.001: Authentic Persona, T0097.111: Government Official Persona, T0129.006: Deny Involvement). |
| [T0129.006 Deny Involvement](../../generated_pages/techniques/T0129.006.md) | IT00000344 <i>“On October 23, Canada’s Foreign Ministry said it had discovered a disinformation campaign, likely tied to China, aimed at discrediting dozens of Canadian politicians, including Prime Minister Justin Trudeau.<br><br> “The ministry said the campaign took place in August and September. It used new and hijacked social media accounts to bulk-post messages targeting Canadian politicians (T0141.001: Acquire Compromised Account).<br><br> “A Chinese Embassy in Canada spokesperson dismissed Canada’s accusation as baseless.<br><br> ““Canada was a downright liar and disseminator of false information… Beijing has never meddled in another nation’s domestic affairs.”<br><br> “A Chinese Embassy in Canada spokesperson dismissed Canada’s accusation as baseless.<br><br> “That is false.<br><br> “The Canadian government's report is based on an investigation conducted by its Rapid Response Mechanism cyber intelligence unit in cooperation with the social media platforms.<br><br> “The investigation exposed China’s disinformation campaign dubbed “Spamouflage” -- for its tactic of using “a network of new or hijacked social media accounts that posts and increases the number of propaganda messages across multiple social media platforms – including Facebook, X/Twitter, Instagram, YouTube, Medium, Reddit, TikTok, and LinkedIn.””</i><br><br> In this case a network of accounts attributed to China were identified operating on multiple platforms. The report was dismissed as false information by an official in the Chinese Embassy in Canada (T0143.001: Authentic Persona, T0097.111: Government Official Persona, T0129.006: Deny Involvement). |
| [T0143.001 Authentic Persona](../../generated_pages/techniques/T0143.001.md) | IT00000342 <i>“On October 23, Canada’s Foreign Ministry said it had discovered a disinformation campaign, likely tied to China, aimed at discrediting dozens of Canadian politicians, including Prime Minister Justin Trudeau.<br><br> “The ministry said the campaign took place in August and September. It used new and hijacked social media accounts to bulk-post messages targeting Canadian politicians (T0141.001: Acquire Compromised Account).<br><br> “A Chinese Embassy in Canada spokesperson dismissed Canada’s accusation as baseless.<br><br> ““Canada was a downright liar and disseminator of false information… Beijing has never meddled in another nation’s domestic affairs.”<br><br> “A Chinese Embassy in Canada spokesperson dismissed Canada’s accusation as baseless.<br><br> “That is false.<br><br> “The Canadian government's report is based on an investigation conducted by its Rapid Response Mechanism cyber intelligence unit in cooperation with the social media platforms.<br><br> “The investigation exposed China’s disinformation campaign dubbed “Spamouflage” -- for its tactic of using “a network of new or hijacked social media accounts that posts and increases the number of propaganda messages across multiple social media platforms – including Facebook, X/Twitter, Instagram, YouTube, Medium, Reddit, TikTok, and LinkedIn.””</i><br><br> In this case a network of accounts attributed to China were identified operating on multiple platforms. The report was dismissed as false information by an official in the Chinese Embassy in Canada (T0143.001: Authentic Persona, T0097.111: Government Official Persona, T0129.006: Deny Involvement). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@ -8,7 +8,7 @@
| Incident | Descriptions given for this incident |
| -------- | -------------------- |
| [I00017 US presidential elections](../../generated_pages/incidents/I00017.md) | Click-bait (economic actors) fake news sites (e.g. Denver Guardian; Macedonian teens) |
| [I00079 Three thousand fake tanks](../../generated_pages/incidents/I00079.md) | <i>“On January 4 [2017], a little-known news site based in Donetsk, Ukraine published an article claiming that the United States was sending 3,600 tanks to Europe as part of “the NATO war preparation against Russia”.<br><br> “Like much fake news, this story started with a grain of truth: the US was about to reinforce its armored units in Europe. However, the article converted literally thousands of other vehicles — including hundreds of Humvees and trailers — into tanks, building the US force into something 20 times more powerful than it actually was.<br><br> “The story caught on online. Within three days it had been repeated by a dozen websites in the United States, Canada and Europe, and shared some 40,000 times. It was translated into Norwegian; quoted, unchallenged, by Russian state news agency RIA Novosti; and spread among Russian-language websites.<br><br> “It was also an obvious fake, as any Google news search would have revealed. Yet despite its evident falsehood, it spread widely, and not just in directly Kremlin-run media. Tracking the spread of this fake therefore shines a light on the wider question of how fake stories are dispersed.”</i><br><br> Russian state news agency RIA Novosti presents itself as a news outlet (T0097.202: News Outlet Persona). RIA Novosti is a real news outlet (T0143.001: Authentic Persona), but it did not carry out the basic investigation into the veracity of the narrative it published that is implicitly expected of institutions presenting themselves as news outlets.<br><br> We can’t know how or why this narrative ended up being published by RIA Novosti, but we know that it presented a distorted reality as authentic information (T0023: Distort Facts), claiming that the US was sending 3,600 tanks, instead of 3,600 vehicles which included ~180 tanks. |
| [I00079 Three thousand fake tanks](../../generated_pages/incidents/I00079.md) | <i>“On January 4 [2017], however, the Donbas News International (DNI) agency, based in Donetsk, Ukraine, and (since September 2016) an official state media outlet of the unrecognized separatist Donetsk People’s Republic, ran an article under the sensational headline, “US sends 3,600 tanks against Russia — massive NATO deployment under way.” DNI is run by Finnish exile Janus Putkonen, described by the Finnish national broadcaster, YLE, as a “Finnish info warrior”, and the first foreigner to be granted a Donetsk passport.<br><br>“The equally sensational opening paragraph ran, “The NATO war preparation against Russia, ‘Operation Atlantic Resolve’, is in full swing. 2,000 US tanks will be sent in coming days from Germany to Eastern Europe, and 1,600 US tanks is deployed to storage facilities in the Netherlands. At the same time, NATO countries are sending thousands of soldiers in to Russian borders.”<br><br>“The report is based around an obvious factual error, conflating the total number of vehicles with the actual number of tanks, and therefore multiplying the actual tank force 20 times over. For context, military website globalfirepower.com puts the total US tank force at 8,848. If the DNI story had been true, it would have meant sending 40% of all the US’ main battle tanks to Europe in one go.<br><br>“Could this have been an innocent mistake? The simple answer is “no”. The journalist who penned the story had a sufficient command of the details to be able to write, later in the same article, “In January, 26 tanks, 100 other vehicles and 120 containers will be transported by train to Lithuania. Germany will send the 122nd Infantry Battalion.” Yet the same author apparently believed, in the headline and first paragraph, that every single vehicle in Atlantic Resolve is a tank. To call this an innocent mistake is simply not plausible.<br><br>“The DNI story can only realistically be considered a deliberate fake designed to caricaturize and demonize NATO, the United States and Germany (tactfully referred to in the report as having “rolled over Eastern Europe in its war of extermination 75 years ago”) by grossly overstating the number of MBTs involved.”</i><br><br>This behaviour matches T0016: Create Clickbait because the person who wrote the story is shown to be aware of the fact that there were non-tank vehicles later in their story, but still chose to give the article a sensationalist headline claiming that all vehicles being sent were tanks. |
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
@ -8,7 +8,7 @@
| Incident | Descriptions given for this incident |
| -------- | -------------------- |
| [I00086 #WeAreNotSafe – Exposing How a Post-October 7th Disinformation Network Operates on Israeli Social Media](../../generated_pages/incidents/I00086.md) | In this report accounts were identified as part of “a sophisticated and extensive coordinated network orchestrating a disinformation campaign targeting Israeli digital spaces since October 7th, 2023”.<br><br><i>“A conspicuous aspect of these accounts is the likely usage of machine-translated Hebrew. The disjointed and linguistically strange comments imply that the CIB’s architects are not Hebrew-speaking and likely translate to Hebrew using online tools. There’s no official way to confirm that a text is translated, but it is evident when the gender for nouns is incorrect, very unusual words or illogical grammar being used usually lead to the conclusion that the comment was not written by a native speaker that is aware of the nuances of the language.”</i><br><br>In this example analysts asserted that accounts were posting content which had been translated via machine (T0085.008: Machine Translated Text), based on indicators such as issues with grammar and gender. |
| [I00088 Much Ado About ‘Somethings’ - China-Linked Influence Operation Endures Despite Takedown](../../generated_pages/incidents/I00088.md) | <i>“The broader War of Somethings (WoS) network, so dubbed because all the Facebook pages and user accounts in the network are connected to “The War of Somethings” page, behaves very similarly to previous Spamouflage campaigns. [Spamouflage is a coordinated inauthentic behaviour network attributed to the Chinese state.]<br><br> “Like other components of Spamouflage, the WoS network sometimes intersperses apolitical content with its more agenda-driven material. Many members post nearly identical comments at almost the same time. The text includes markers of automatic translation while error messages included as profile photos indicate the automated pulling of stock images.”</i><br><br> In this example analysts found an indicator of automated use of stock images in Facebook accounts; some of the accounts in the network appeared to have mistakenly uploaded error messages as profile pictures (T0145.006: Stock Image Account Imagery). The text posted by the accounts also appeared to have been translated using automation (T0085.008: Machine Translated Text). |
| [I00088 Much Ado About ‘Somethings’ - China-Linked Influence Operation Endures Despite Takedown](../../generated_pages/incidents/I00088.md) | <i>“The broader War of Somethings (WoS) network, so dubbed because all the Facebook pages and user accounts in the network are connected to “The War of Somethings” page, behaves very similarly to previous Spamouflage campaigns. [Spamouflage is a coordinated inauthentic behaviour network attributed to the Chinese state.]<br><br> “Like other components of Spamouflage, the WoS network sometimes intersperses apolitical content with its more agenda-driven material. Many members post nearly identical comments at almost the same time. The text includes markers of automatic translation while error messages included as profile photos indicate the automated pulling of stock images.”</i><br><br> In this example analysts found an indicator of automated use of stock images in Facebook accounts; some of the accounts in the network appeared to have mistakenly uploaded error messages as profile pictures (T0145.007: Stock Image Account Imagery). The text posted by the accounts also appeared to have been translated using automation (T0085.008: Machine Translated Text). |
@ -8,7 +8,7 @@
| Incident | Descriptions given for this incident |
| -------- | -------------------- |
| [I00068 Attempted Audio Deepfake Call Targets LastPass Employee](../../generated_pages/incidents/I00068.md) | <i>“While reports of [...] deepfake calls targeting private companies are luckily still rare, LastPass itself experienced a deepfake attempt earlier today that we are sharing with the larger community to help raise awareness that this tactic is spreading and all companies should be on the alert. In our case, an employee received a series of calls, texts, and at least one voicemail featuring an audio deepfake from a threat actor impersonating our CEO via WhatsApp. As the attempted communication was outside of normal business communication channels and due to the employee’s suspicion regarding the presence of many of the hallmarks of a social engineering attempt (such as forced urgency), our employee rightly ignored the messages and reported the incident to our internal security team so that we could take steps to both mitigate the threat and raise awareness of the tactic both internally and externally.”</i><br><br> In this example attackers impersonated the CEO of LastPass (T0097.100: Individual Persona, T0143.003: Impersonated Persona), targeting one of its employees over WhatsApp (T0043.001: Use Encrypted Chat Apps) using deepfaked audio (T0088.001: Develop AI-Generated Audio (Deepfakes)). |
| [I00069 Uncharmed: Untangling Iran's APT42 Operations](../../generated_pages/incidents/I00069.md) | <I>“[Iranian state-sponsored cyber espionage actor] APT42 cloud operations attack lifecycle can be described in details as follows:<br> <br>- “Social engineering schemes involving decoys and trust building, which includes masquerading as legitimate NGOs and conducting ongoing correspondence with the target, sometimes lasting several weeks. <br>- The threat actor masqueraded as well-known international organizations in the legal and NGO fields and sent emails from domains typosquatting the original NGO domains, for example aspenlnstitute[.]org. <br>- The Aspen Institute became aware of this spoofed domain and collaborated with industry partners, including blocking it in SafeBrowsing, thus protecting users of Google Chrome and additional browsers. <br>- To increase their credibility, APT42 impersonated high-ranking personnel working at the aforementioned organizations when creating the email personas. <br>- APT42 enhanced their campaign credibility by using decoy material inviting targets to legitimate and relevant events and conferences. In one instance, the decoy material was hosted on an attacker-controlled SharePoint folder, accessible only after the victim entered their credentials. Mandiant did not identify malicious elements in the files, suggesting they were used solely to gain the victim’s trust.”</I><br><br> In this example APT42, an Iranian state-sponsored cyber espionage actor, created a domain impersonating the existing NGO The Aspen Institute (T00143.004: Impersonated Persona, T0097.207: NGO Persona). They increased the perceived legitimacy of the impersonation by also impersonating high-ranking employees of the NGO (T0097.100: Individual Persona, T00143.004: Impersonated Persona). |
| [I00069 Uncharmed: Untangling Iran's APT42 Operations](../../generated_pages/incidents/I00069.md) | <I>“[Iranian state-sponsored cyber espionage actor] APT42 cloud operations attack lifecycle can be described in details as follows:<br> <br>- “Social engineering schemes involving decoys and trust building, which includes masquerading as legitimate NGOs and conducting ongoing correspondence with the target, sometimes lasting several weeks. <br>- The threat actor masqueraded as well-known international organizations in the legal and NGO fields and sent emails from domains typosquatting the original NGO domains, for example aspenlnstitute[.]org. <br>- The Aspen Institute became aware of this spoofed domain and collaborated with industry partners, including blocking it in SafeBrowsing, thus protecting users of Google Chrome and additional browsers. <br>- To increase their credibility, APT42 impersonated high-ranking personnel working at the aforementioned organizations when creating the email personas. <br>- APT42 enhanced their campaign credibility by using decoy material inviting targets to legitimate and relevant events and conferences. In one instance, the decoy material was hosted on an attacker-controlled SharePoint folder, accessible only after the victim entered their credentials. Mandiant did not identify malicious elements in the files, suggesting they were used solely to gain the victim’s trust.”</I><br><br> In this example APT42, an Iranian state-sponsored cyber espionage actor, created a domain impersonating the existing NGO The Aspen Institute (T0143.003: Impersonated Persona, T0097.207: NGO Persona). They increased the perceived legitimacy of the impersonation by also impersonating high-ranking employees of the NGO (T0097.100: Individual Persona, T0143.003: Impersonated Persona). |
@ -11,8 +11,7 @@
| [I00076 Network of Social Media Accounts Impersonates U.S. Political Candidates, Leverages U.S. and Israeli Media in Support of Iranian Interests](../../generated_pages/incidents/I00076.md) | <i>“In addition to directly posting material on social media, we observed some personas in the network [of inauthentic accounts attributed to Iran] leverage legitimate print and online media outlets in the U.S. and Israel to promote Iranian interests via the submission of letters, guest columns, and blog posts that were then published. We also identified personas that we suspect were fabricated for the sole purpose of submitting such letters, but that do not appear to maintain accounts on social media. The personas claimed to be based in varying locations depending on the news outlets they were targeting for submission; for example, a persona that listed their location as Seattle, WA in a letter submitted to the Seattle Times subsequently claimed to be located in Baytown, TX in a letter submitted to The Baytown Sun. Other accounts in the network then posted links to some of these letters on social media.”</i><br><br> In this example actors fabricated individuals who lived in areas which were being targeted for influence through the use of letters to local papers (T0097.101: Local Persona, T0143.002: Fabricated Persona). |
| [I00078 Meta’s September 2020 Removal of Coordinated Inauthentic Behavior](../../generated_pages/incidents/I00078.md) | <i>“[Meta has] removed one Page, five Facebook accounts, one Group and three Instagram accounts for foreign or government interference which is coordinated inauthentic behavior on behalf of a foreign or government entity. This small network originated in Russia and focused primarily on Turkey and Europe, and also on the United States.<br><br> “This operation relied on fake accounts — some of which had been already detected and removed by our automated systems — to manage their Page and their Group, and to drive people to their site purporting to be an independent think-tank based primarily in Turkey. These accounts posed as locals based in Turkey, Canada and the US. They also recruited people to write for their website. This network had almost no following on our platforms when we removed it.”</i><br><br> Meta identified that a network of accounts originating in Russia were driving people off platform to a site which presented itself as a think-tank (T0097.204: Think Tank Persona). Meta did not make an attribution about the authenticity of this off-site think tank, so neither T0143.001: Authentic Persona or T0143.002: Fabricated Persona are used here.<br><br> Meta had access to technical data for accounts on its platform, and asserted that they were fabricated individuals posing as locals who recruited targets to write content for their website (T0097.101: Local Persona, T0097.106: Recruiter Persona, T0143.002: Fabricated Persona). |
| [I00081 Belarus KGB created fake accounts to criticize Poland during border crisis, Facebook parent company says](../../generated_pages/incidents/I00081.md) | <i>“Meta said it also removed 31 Facebook accounts, four groups, two events and four Instagram accounts that it believes originated in Poland and targeted Belarus and Iraq. Those allegedly fake accounts posed as Middle Eastern migrants posting about the border crisis. Meta did not link the accounts to a specific group.<br><br> ““These fake personas claimed to be sharing their own negative experiences of trying to get from Belarus to Poland and posted about migrants’ difficult lives in Europe,” Meta said. “They also posted about Poland’s strict anti-migrant policies and anti-migrant neo-Nazi activity in Poland. They also shared links to news articles criticizing the Belarusian government’s handling of the border crisis and off-platform videos alleging migrant abuse in Europe.””</i><br><br> In this example accounts falsely presented themselves as having local insight into the border crisis narrative (T0097.101: Local Persona, T0143.002: Fabricated Persona). |
| [I00086 #WeAreNotSafe – Exposing How a Post-October 7th Disinformation Network Operates on Israeli Social Media](../../generated_pages/incidents/I00086.md) | Accounts which were identified as part of “a sophisticated and extensive coordinated network orchestrating a disinformation campaign targeting Israeli digital spaces since October 7th, 2023” were presenting themselves as locals to Israel (T0097.101: Local Persona):<br><br><i>“Unlike usual low-effort fake accounts, these accounts meticulously mimic young Israelis. They stand out due to the extraordinary lengths taken to ensure their authenticity, from unique narratives to the content they produce to their seemingly authentic interactions.”<I>
| [I00086 #WeAreNotSafe – Exposing How a Post-October 7th Disinformation Network Operates on Israeli Social Media](../../generated_pages/incidents/I00086.md) | Accounts which were identified as part of “a sophisticated and extensive coordinated network orchestrating a disinformation campaign targeting Israeli digital spaces since October 7th, 2023” were presenting themselves as locals to Israel (T0097.101: Local Persona):<br><br><i>“Unlike usual low-effort fake accounts, these accounts meticulously mimic young Israelis. They stand out due to the extraordinary lengths taken to ensure their authenticity, from unique narratives to the content they produce to their seemingly authentic interactions.”<I> |
| [I00087 Challenging Truth and Trust: A Global Inventory of Organized Social Media Manipulation](../../generated_pages/incidents/I00087.md) | <i>“Another actor operating in China is the American-based company Devumi. Most of the Twitter accounts managed by Devumi resemble real people, and some are even associated with a kind of large-scale social identity theft. At least 55,000 of the accounts use the names, profile pictures, hometowns and other personal details of real Twitter users, including minors, according to The New York Times (Confessore et al., 2018)).”</i><br><br>In this example accounts impersonated real locals while spreading operation narratives (T0143.003: Impersonated Persona, T0097.101: Local Persona). The impersonation included stealing the legitimate accounts’ profile pictures (T0145.001: Copy Account Imagery). |
@ -10,8 +10,7 @@
| [I00069 Uncharmed: Untangling Iran's APT42 Operations](../../generated_pages/incidents/I00069.md) | <I>“In March 2023, [Iranian state-sponsored cyber espionage actor] APT42 sent a spear-phishing email with a fake Google Meet invitation, allegedly sent on behalf of Mona Louri, a likely fake persona leveraged by APT42, claiming to be a human rights activist and researcher. Upon entry, the user was presented with a fake Google Meet page and asked to enter their credentials, which were subsequently sent to the attackers.”</i><br><br>In this example APT42, an Iranian state-sponsored cyber espionage actor, created an account which presented as a human rights activist (T0097.103: Activist Persona) and researcher (T0097.107: Researcher Persona). The analysts assert that it was likely the persona was fabricated (T0143.002: Fabricated Persona) |
| [I00077 Fronts & Friends: An Investigation into Two Twitter Networks Linked to Russian Actors](../../generated_pages/incidents/I00077.md) | <i>“The Syria portion of the network [of inauthentic accounts attributed to Russia] included additional sockpuppet accounts. One of these claimed to be a gay rights defender in Syria. Several said they were Syrian journalists. Another account, @SophiaHammer3, said she was born in Syria but currently lives in London. “I’m fond of history and politics. I struggle for justice.” Twitter users had previously observed that Sophia was likely a sockpuppet.”</i><br><br> This behaviour matches T0097.103: Activist Persona because the account presents itself as defending a political cause - in this case gay rights.<br><br> Twitter’s technical indicators allowed their analysts to assert that these accounts were “reliably tied to Russian state actors”, meaning the presented personas were entirely fabricated (T0143.002: Fabricated Persona); these accounts are not legitimate gay rights defenders or journalists, they’re assets controlled by Russia publishing narratives beneficial to their agenda. |
| [I00080 Hundreds Of Propaganda Accounts Targeting Iran And Qatar Have Been Removed From Facebook](../../generated_pages/incidents/I00080.md) | <i>“One example of a fake reporter account targeting Americans is “Jenny Powell,” a self-described Washington-based journalist, volunteer, and environmental activist. At first glance, Powell’s Twitter timeline looks like it belongs to a young and eager reporter amplifying her interests. But her profile photo is a stock image, and many of her links go to the propaganda sites.<br><br> “Powell, who joined the platform just last month, shares links to stories from major US news media outlets, retweets local news about Washington, DC, and regularly promotes content from The Foreign Code and The Economy Club. Other fake journalist accounts behaved similarly to Powell and had generic descriptions. One of the accounts, for a fake Bruce Lopez in Louisiana, has a bio that describes him as a “Correspondent Traveler noun|linking verb|noun/verb/adjective|,” which appears to reveal the formula used to write Twitter bios for the accounts.”</i><br><br> The Jenny Powel account used in this influence operation presents as both a journalist and an activist (T0097.102: Journalist Persona, T0097.103: Activist Persona, T0143.002: Fabricated Persona). This example shows how threat actors can easily follow a template to present a fabricated persona to their target audience (T0144.002: Persona Template). |
| [I00082 Meta’s November 2021 Adversarial Threat Report ](../../generated_pages/incidents/I00082.md) | <I>“[Meta] removed 41 Facebook accounts, five Groups, and four Instagram accounts for violating our policy against coordinated inauthentic behavior. This activity originated in Belarus and primarily targeted audiences in the Middle East and Europe.<br><br> “The core of this activity began in October 2021, with some accounts created as recently as mid-November. The people behind it used newly-created fake accounts — many of which were detected and disabled by our automated systems soon after creation — to pose as journalists and activists from the European Union, particularly Poland and Lithuania. Some of the accounts used profile photos likely generated using artificial intelligence techniques like generative adversarial networks (GAN). These fictitious personas posted criticism of Poland in English, Polish, and Kurdish, including pictures and videos about Polish border guards allegedly violating migrants’ rights, and compared Poland’s treatment of migrants against other countries’. They also posted to Groups focused on the welfare of migrants in Europe. A few accounts posted in Russian about relations between Belarus and the Baltic States.”</i><br><br> This example shows how accounts identified as participating in coordinated inauthentic behaviour were presenting themselves as journalists and activists while spreading operation narratives (T0097.102: Journalist Persona, T0097.103: Activist Persona).<br><br> Additionally, analysts at Meta identified accounts which were participating in coordinated inauthentic behaviour that had likely used AI-Generated images as their profile pictures (T0145.002: AI-Generated Account Imagery)., <i>“[Meta] removed a network of accounts in Vietnam for violating our Inauthentic Behavior policy against mass reporting. They coordinated the targeting of activists and other people who publicly criticized the Vietnamese government and used false reports of various violations in an attempt to have these users removed from our platform. The people behind this activity relied primarily on authentic and duplicate accounts to submit hundreds — in some cases, thousands — of complaints against their targets through our abuse reporting flows.<br><br>“Many operators also maintained fake accounts — some of which were detected and disabled by our automated systems — to pose as their targets so they could then report the legitimate accounts as fake. They would frequently change the gender and name of their fake accounts to resemble the target individual. Among the most common claims in this misleading reporting activity were complaints of impersonation, and to a much lesser extent inauthenticity. The network also advertised abusive services in their bios and constantly evolved their tactics in an attempt to evade detection.“</i><br><br>In this example actors repurposed their accounts to impersonate targeted activists (T0097.103: Activist Persona, T0143.003: Impersonated Persona) in order to falsely report the activists’ legitimate accounts as impersonations (T0124.001: Report Non-Violative Opposing Content) |
| [I00082 Meta’s November 2021 Adversarial Threat Report ](../../generated_pages/incidents/I00082.md) | <i>“[Meta] removed a network of accounts in Vietnam for violating our Inauthentic Behavior policy against mass reporting. They coordinated the targeting of activists and other people who publicly criticized the Vietnamese government and used false reports of various violations in an attempt to have these users removed from our platform. The people behind this activity relied primarily on authentic and duplicate accounts to submit hundreds — in some cases, thousands — of complaints against their targets through our abuse reporting flows.<br><br>“Many operators also maintained fake accounts — some of which were detected and disabled by our automated systems — to pose as their targets so they could then report the legitimate accounts as fake. They would frequently change the gender and name of their fake accounts to resemble the target individual. Among the most common claims in this misleading reporting activity were complaints of impersonation, and to a much lesser extent inauthenticity. The network also advertised abusive services in their bios and constantly evolved their tactics in an attempt to evade detection.“</i><br><br>In this example actors repurposed their accounts to impersonate targeted activists (T0097.103: Activist Persona, T0143.003: Impersonated Persona) in order to falsely report the activists’ legitimate accounts as impersonations (T0124.001: Report Non-Violative Opposing Content). |
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
@ -7,10 +7,10 @@
| Incident | Descriptions given for this incident |
| -------- | -------------------- |
| [I00069 Uncharmed: Untangling Iran's APT42 Operations](../../generated_pages/incidents/I00069.md) | <i>“Mandiant identified at least three clusters of infrastructure used by [Iranian state-sponsored cyber espionage actor] APT42 to harvest credentials from targets in the policy and government sectors, media organizations and journalists, and NGOs and activists. The three clusters employ similar tactics, techniques and procedures (TTPs) to target victim credentials (spear-phishing emails), but use slightly varied domains, masquerading patterns, decoys, and themes.<br><br> Cluster A: Posing as News Outlets and NGOs: <br>- Suspected Targeting: credentials of journalists, researchers, and geopolitical entities in regions of interest to Iran. <br>- Masquerading as: The Washington Post (U.S.), The Economist (UK), The Jerusalem Post (IL), Khaleej Times (UAE), Azadliq (Azerbaijan), and more news outlets and NGOs. This often involves the use of typosquatted domains like washinqtonpost[.]press. <br><br>“Mandiant did not observe APT42 target or compromise these organizations, but rather impersonate them.”</I><br><br> In this example APT42, an Iranian state-sponsored cyber espionage actor, impersonated existing news organisations and NGOs (T0097.202 News Outlet Persona, T0097.207: NGO Persona, T00143.004: Impersonated Persona) in attempts to steal credentials from targets (T0141.001: Acquire Compromised Account), using elements of influence operations to facilitate their cyber attacks. |
| [I00069 Uncharmed: Untangling Iran's APT42 Operations](../../generated_pages/incidents/I00069.md) | <i>“Mandiant identified at least three clusters of infrastructure used by [Iranian state-sponsored cyber espionage actor] APT42 to harvest credentials from targets in the policy and government sectors, media organizations and journalists, and NGOs and activists. The three clusters employ similar tactics, techniques and procedures (TTPs) to target victim credentials (spear-phishing emails), but use slightly varied domains, masquerading patterns, decoys, and themes.<br><br> Cluster A: Posing as News Outlets and NGOs: <br>- Suspected Targeting: credentials of journalists, researchers, and geopolitical entities in regions of interest to Iran. <br>- Masquerading as: The Washington Post (U.S.), The Economist (UK), The Jerusalem Post (IL), Khaleej Times (UAE), Azadliq (Azerbaijan), and more news outlets and NGOs. This often involves the use of typosquatted domains like washinqtonpost[.]press. <br><br>“Mandiant did not observe APT42 target or compromise these organizations, but rather impersonate them.”</I><br><br> In this example APT42, an Iranian state-sponsored cyber espionage actor, impersonated existing news organisations and NGOs (T0097.202 News Outlet Persona, T0097.207: NGO Persona, T0143.003: Impersonated Persona) in attempts to steal credentials from targets (T0141.001: Acquire Compromised Account), using elements of influence operations to facilitate their cyber attacks. |
| [I00074 The Tactics & Tropes of the Internet Research Agency](../../generated_pages/incidents/I00074.md) | <i>“The Black Matters Facebook Page [operated by Russia’s Internet Research Agency] explored several visual brand identities, moving from a plain logo to a gothic typeface on Jan 19th, 2016. On February 4th, 2016, the person who ran the Facebook Page announced the launch of the website, blackmattersus[.]com, emphasizing media distrust and a desire to build Black independent media; [“I DIDN’T BELIEVE THE MEDIA / SO I BECAME ONE”]”<i><br><br> In this example an asset controlled by Russia’s Internet Research Agency began to present itself as a source of “Black independent media”, claiming that the media could not be trusted (T0097.208: Social Cause Persona, T0097.202: News Outlet Persona, T0143.002: Fabricated Persona). |
| [I00076 Network of Social Media Accounts Impersonates U.S. Political Candidates, Leverages U.S. and Israeli Media in Support of Iranian Interests](../../generated_pages/incidents/I00076.md) | <i>“Accounts in the network [of inauthentic accounts attributed to Iran], under the guise of journalist personas, also solicited various individuals over Twitter for interviews and chats, including real journalists and politicians. The personas appear to have successfully conducted remote video and audio interviews with U.S. and UK-based individuals, including a prominent activist, a radio talk show host, and a former U.S. Government official, and subsequently posted the interviews on social media, showing only the individual being interviewed and not the interviewer. The interviewees expressed views that Iran would likely find favorable, discussing topics such as the February 2019 Warsaw summit, an attack on a military parade in the Iranian city of Ahvaz, and the killing of Jamal Khashoggi.<br><br> “The provenance of these interviews appear to have been misrepresented on at least one occasion, with one persona appearing to have falsely claimed to be operating on behalf of a mainstream news outlet; a remote video interview with a US-based activist about the Jamal Khashoggi killing was posted by an account adopting the persona of a journalist from the outlet Newsday, with the Newsday logo also appearing in the video. We did not identify any Newsday interview with the activist in question on this topic. In another instance, a persona posing as a journalist directed tweets containing audio of an interview conducted with a former U.S. Government official at real media personalities, calling on them to post about the interview.”</i><br><br> In this example actors fabricated journalists (T0097.102: Journalist Persona, T0143.002: Fabricated Persona) who worked at existing news outlets (T0097.202: News Outlet Persona, T0143.003: Impersonated Persona) in order to conduct interviews with targeted individuals. |
| [I00077 Fronts & Friends: An Investigation into Two Twitter Networks Linked to Russian Actors](../../generated_pages/incidents/I00077.md) | <i>“Approximately one-third of the suspended accounts [in the network of inauthentic accounts attributed to Russia] tweeted primarily about Syria, in English, Russian, and Arabic; many accounts tweeted in all three languages. The themes these accounts pushed will be familiar to anyone who has studied Russian overt or covert information operations about Syria: <br> <br>- Praising Russia’s role in Syria; claiming Russia was killing terrorists in Syria and highlighting Russia’s humanitarian aid <br>- Criticizing the role of the Turkey and the US in Syria; claiming the US killed civilians in Syria <br>- Criticizing the White Helmets, and claiming that they worked with Westerners to created scenes to make it look like the Syrian government used chemical weapons <br><br> “The two most prominent Syria accounts were @Syria_FreeNews and @PamSpenser. <br><br> “@Syria_FreeNews had 20,505 followers and was created on April 6, 2017. The account’s bio said “Exclusive information about Middle East and Northern Africa countries events. BreaKing news from the scene.””</i><br><br> This behaviour matches T0097.202: News Outlet Persona because the account @Syrira_FreeNews presented itself as a news outlet in its name, bio, and branding, across all websites on which the persona had been established (T0144.001: Persona Presented across Platforms). Twitter’s technical indicators allowed them to attribute the account “can be reliably tied to Russian state actors”. Because of this we can assert that the persona is entirely fabricated (T0143.002: Fabricated Persona); this is not a legitimate news outlet providing information about Syria, it’s an asset controlled by Russia publishing narratives beneficial to their agenda., <i>“Two accounts [in the second network of accounts taken down by Twitter] appear to have been operated by Oriental Review and the Strategic Culture Foundation, respectively. Oriental Review bills itself as an “open source site for free thinking”, though it trades in outlandish conspiracy theories and posts content bylined by fake people. Stanford Internet Observatory researchers and investigative journalists have previously noted the presence of content bylined by fake “reporter” personas tied to the GRU-linked front Inside Syria Media Center, posted on Oriental Review.”</i><br><br> In an effort to make the Oriental Review’s stories appear more credible, the threat actors created fake journalists and pretended they wrote the articles on their website (aka “bylined” them).<br><br> In DISARM terms, they fabricated journalists (T0143.002: Fabricated Persona, T0097.003: Journalist Persona), and then used these fabricated journalists to increase perceived legitimacy (T0097.202: News Outlet Persona, T0143.002: Fabricated Persona). |
| [I00077 Fronts & Friends: An Investigation into Two Twitter Networks Linked to Russian Actors](../../generated_pages/incidents/I00077.md) | <i>“Two accounts [in the second network of accounts taken down by Twitter] appear to have been operated by Oriental Review and the Strategic Culture Foundation, respectively. Oriental Review bills itself as an “open source site for free thinking”, though it trades in outlandish conspiracy theories and posts content bylined by fake people. Stanford Internet Observatory researchers and investigative journalists have previously noted the presence of content bylined by fake “reporter” personas tied to the GRU-linked front Inside Syria Media Center, posted on Oriental Review.”</i><br><br> In an effort to make the Oriental Review’s stories appear more credible, the threat actors created fake journalists and pretended they wrote the articles on their website (aka “bylined” them).<br><br> In DISARM terms, they fabricated journalists (T0143.002: Fabricated Persona, T0097.003: Journalist Persona), and then used these fabricated journalists to increase perceived legitimacy (T0097.202: News Outlet Persona, T0143.002: Fabricated Persona). |
| [I00079 Three thousand fake tanks](../../generated_pages/incidents/I00079.md) | <i>“On January 4 [2017], a little-known news site based in Donetsk, Ukraine published an article claiming that the United States was sending 3,600 tanks to Europe as part of “the NATO war preparation against Russia”.<br><br> “Like much fake news, this story started with a grain of truth: the US was about to reinforce its armored units in Europe. However, the article converted literally thousands of other vehicles — including hundreds of Humvees and trailers — into tanks, building the US force into something 20 times more powerful than it actually was.<br><br> “The story caught on online. Within three days it had been repeated by a dozen websites in the United States, Canada and Europe, and shared some 40,000 times. It was translated into Norwegian; quoted, unchallenged, by Russian state news agency RIA Novosti; and spread among Russian-language websites.<br><br> “It was also an obvious fake, as any Google news search would have revealed. Yet despite its evident falsehood, it spread widely, and not just in directly Kremlin-run media. Tracking the spread of this fake therefore shines a light on the wider question of how fake stories are dispersed.”</i><br><br> Russian state news agency RIA Novosti presents itself as a news outlet (T0097.202: News Outlet Persona). RIA Novosti is a real news outlet (T0143.001: Authentic Persona), but it did not carry out the basic investigation into the veracity of the narrative it published that is implicitly expected of institutions presenting themselves as news outlets.<br><br> We can’t know how or why this narrative ended up being published by RIA Novosti, but we know that it presented a distorted reality as authentic information (T0023: Distort Facts), claiming that the US was sending 3,600 tanks, instead of 3,600 vehicles which included ~180 tanks. |
| [I00094 A glimpse inside a Chinese influence campaign: How bogus news websites blur the line between true and false](../../generated_pages/incidents/I00094.md) | Researchers identified websites managed by a Chinese marketing firm which presented themselves as news organisations.<br><br> <i>“On its official website, the Chinese marketing firm boasted that they were in contact with news organizations across the globe, including one in South Korea called the “Chungcheng Times.” According to the joint team, this outlet is a fictional news organization created by the offending company. The Chinese company sought to disguise the site’s true identity and purpose by altering the name attached to it by one character—making it very closely resemble the name of a legitimate outlet operating out of Chungchengbuk-do.<br><br> “The marketing firm also established a news organization under the Korean name “Gyeonggido Daily,” which closely resembles legitimate news outlets operating out of Gyeonggi province such as “Gyeonggi Daily,” “Daily Gyeonggi Newspaper,” and “Gyeonggi N Daily.” One of the fake news sites was named “Incheon Focus,” a title that could be easily mistaken for the legitimate local news outlet, “Focus Incheon.” Furthermore, the Chinese marketing company operated two fake news sites with names identical to two separate local news organizations, one of which ceased operations in December 2022.<br><br> “In total, fifteen out of eighteen Chinese fake news sites incorporated the correct names of real regions in their fake company names. “If the operators had created fake news sites similar to major news organizations based in Seoul, however, the intended deception would have easily been uncovered,” explained Song Tae-eun, an assistant professor in the Department of National Security & Unification Studies at the Korea National Diplomatic Academy, to The Readable. “There is also the possibility that they are using the regional areas as an attempt to form ties with the local community; that being the government, the private sector, and religious communities.””</i><br><br> The firm styled their news site to resemble existing local news outlets in their target region (T0097.201: Local Institution Persona, T0097.202: News Outlet Persona, T0143.003: Impersonated Persona). |
@ -1,6 +1,6 @@
# Technique T0097.204: Think Tank Persona
* **Summary**: An institution with a think tank persona presents itself as a think tank; an organisation that aims to conduct original research and propose new policies or solutions, especially for social and scientific problems.<br><br> While presenting as a think tank is not an indication of inauthentic behaviour, think tank personas are commonly used by threat actors as a front for their operational activity (T0143.002: Fabricated Persona, T0097.204: Think Tank Persona). They may be created to give legitimacy to narratives and allow them to suggest politically beneficial solutions to societal issues.<br><br> Legitimate think tanks could have a political bias that they may not be transparent about, they could use their persona for malicious purposes, or they could be exploited by threat actors (T0143.001: Authentic Persona, T0097.204: Think Tank Persona). For example, a think tank could take money for using their position to provide legitimacy to a false narrative, or be tricked into doing so without their knowledge.<br><br> <b>Associated Techniques and Sub-techniques</b><br> <b>T0097.107: Researcher Persona:</br> Institutions presenting as think tanks may also present researchers working within the organisation.
* **Summary**: An institution with a think tank persona presents itself as a think tank; an organisation that aims to conduct original research and propose new policies or solutions, especially for social and scientific problems.<br><br> While presenting as a think tank is not an indication of inauthentic behaviour, think tank personas are commonly used by threat actors as a front for their operational activity (T0143.002: Fabricated Persona, T0097.204: Think Tank Persona). They may be created to give legitimacy to narratives and allow them to suggest politically beneficial solutions to societal issues.<br><br> Legitimate think tanks could have a political bias that they may not be transparent about, they could use their persona for malicious purposes, or they could be exploited by threat actors (T0143.001: Authentic Persona, T0097.204: Think Tank Persona). For example, a think tank could take money for using their position to provide legitimacy to a false narrative, or be tricked into doing so without their knowledge.<br><br> <b>Associated Techniques and Sub-techniques</b><br> <b>T0097.107: Researcher Persona:</b> Institutions presenting as think tanks may also present researchers working within the organisation.
* **Belongs to tactic stage**: TA16
File diff suppressed because one or more lines are too long
@ -7,7 +7,7 @@
| Incident | Descriptions given for this incident |
| -------- | -------------------- |
| [I00069 Uncharmed: Untangling Iran's APT42 Operations](../../generated_pages/incidents/I00069.md) | <i>“Mandiant identified at least three clusters of infrastructure used by [Iranian state-sponsored cyber espionage actor] APT42 to harvest credentials from targets in the policy and government sectors, media organizations and journalists, and NGOs and activists. The three clusters employ similar tactics, techniques and procedures (TTPs) to target victim credentials (spear-phishing emails), but use slightly varied domains, masquerading patterns, decoys, and themes.<br><br> Cluster A: Posing as News Outlets and NGOs: <br>- Suspected Targeting: credentials of journalists, researchers, and geopolitical entities in regions of interest to Iran. <br>- Masquerading as: The Washington Post (U.S.), The Economist (UK), The Jerusalem Post (IL), Khaleej Times (UAE), Azadliq (Azerbaijan), and more news outlets and NGOs. This often involves the use of typosquatted domains like washinqtonpost[.]press. <br><br>“Mandiant did not observe APT42 target or compromise these organizations, but rather impersonate them.”</I><br><br> In this example APT42, an Iranian state-sponsored cyber espionage actor, impersonated existing news organisations and NGOs (T0097.202 News Outlet Persona, T0097.207: NGO Persona, T00143.004: Impersonated Persona) in attempts to steal credentials from targets (T0141.001: Acquire Compromised Account), using elements of influence operations to facilitate their cyber attacks. |
| [I00069 Uncharmed: Untangling Iran's APT42 Operations](../../generated_pages/incidents/I00069.md) | <I>“[Iranian state-sponsored cyber espionage actor] APT42 cloud operations attack lifecycle can be described in details as follows:<br> <br>- “Social engineering schemes involving decoys and trust building, which includes masquerading as legitimate NGOs and conducting ongoing correspondence with the target, sometimes lasting several weeks. <br>- The threat actor masqueraded as well-known international organizations in the legal and NGO fields and sent emails from domains typosquatting the original NGO domains, for example aspenlnstitute[.]org. <br>- The Aspen Institute became aware of this spoofed domain and collaborated with industry partners, including blocking it in SafeBrowsing, thus protecting users of Google Chrome and additional browsers. <br>- To increase their credibility, APT42 impersonated high-ranking personnel working at the aforementioned organizations when creating the email personas. <br>- APT42 enhanced their campaign credibility by using decoy material inviting targets to legitimate and relevant events and conferences. In one instance, the decoy material was hosted on an attacker-controlled SharePoint folder, accessible only after the victim entered their credentials. Mandiant did not identify malicious elements in the files, suggesting they were used solely to gain the victim’s trust.”</I><br><br> In this example APT42, an Iranian state-sponsored cyber espionage actor, created a domain impersonating the existing NGO The Aspen Institute (T0143.003: Impersonated Persona, T0097.207: NGO Persona). They increased the perceived legitimacy of the impersonation by also impersonating high-ranking employees of the NGO (T0097.100: Individual Persona, T0143.003: Impersonated Persona). |
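Both incident descriptions above hinge on typosquatted domains (washinqtonpost[.]press, aspenlnstitute[.]org) that swap visually similar characters. As an illustrative sketch only, not part of the DISARM framework or the incident reporting, one way defenders commonly flag such look-alikes is by measuring the edit distance between a candidate name and a protected name; the threshold below is an assumption chosen for the example:

```python
# Hypothetical look-alike (typosquat) domain check, minimal sketch.
# Domain names are taken from the incident descriptions above; the
# max_distance threshold is an illustrative assumption, not a standard.

def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,        # deletion
                            curr[j - 1] + 1,    # insertion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return prev[-1]

def is_lookalike(candidate: str, legitimate: str, max_distance: int = 2) -> bool:
    """Flag names within a small, non-zero edit distance of a protected name."""
    return 0 < levenshtein(candidate, legitimate) <= max_distance

print(is_lookalike("washinqtonpost", "washingtonpost"))  # q for g -> True
print(is_lookalike("aspenlnstitute", "aspeninstitute"))  # l for i -> True
```

Real brand-protection tooling layers homoglyph normalisation and TLD checks on top of this; the sketch shows only the core distance comparison.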
@ -7,11 +7,7 @@
| Incident | Descriptions given for this incident |
| -------- | -------------------- |
| [I00064 Tinder nightmares: the promise and peril of political bots](../../generated_pages/incidents/I00064.md) | _"In the days leading up to the UK’s [2017] general election, youths looking for love online encountered a whole new kind of Tinder nightmare. A group of young activists built a Tinder chatbot to co-opt profiles and persuade swing voters to support Labour. The bot accounts sent 30,000-40,000 messages to targeted 18-25 year olds in battleground constituencies like Dudley North, which Labour ended up winning by only 22 votes._<br /> <br />_"Tinder is a dating app where users swipe right to indicate attraction and interest in a potential partner. If both people swipe right on each other’s profile, a dialogue box becomes available for them to privately chat. After meeting their crowdfunding goal of only £500, the team built a tool which took over and operated the accounts of recruited Tinder-users. By upgrading the profiles to Tinder Premium, the team was able to place bots in any contested constituency across the UK. Once planted, the bots swiped right on all users in the attempt to get the largest number of matches and inquire into their voting intentions."_ <br /> <br />This incident matches T0104.002: Dating App, as users of Tinder were targeted in an attempt to persuade users to vote for a particular party in the upcoming election, rather than for the purpose of connecting those who were authentically interested in dating each other. |
@ -7,8 +7,7 @@
| Incident | Descriptions given for this incident |
| -------- | -------------------- |
| [I00082 Meta’s November 2021 Adversarial Threat Report ](../../generated_pages/incidents/I00082.md) | <i>“[Meta] removed a network of accounts in Vietnam for violating our Inauthentic Behavior policy against mass reporting. They coordinated the targeting of activists and other people who publicly criticized the Vietnamese government and used false reports of various violations in an attempt to have these users removed from our platform. The people behind this activity relied primarily on authentic and duplicate accounts to submit hundreds — in some cases, thousands — of complaints against their targets through our abuse reporting flows.<br><br>“Many operators also maintained fake accounts — some of which were detected and disabled by our automated systems — to pose as their targets so they could then report the legitimate accounts as fake. They would frequently change the gender and name of their fake accounts to resemble the target individual. Among the most common claims in this misleading reporting activity were complaints of impersonation, and to a much lesser extent inauthenticity. The network also advertised abusive services in their bios and constantly evolved their tactics in an attempt to evade detection.“</i><br><br>In this example actors repurposed their accounts to impersonate targeted activists (T0097.103: Activist Persona, T0143.003: Impersonated Persona) in order to falsely report the activists’ legitimate accounts as impersonations (T0124.001: Report Non-Violative Opposing Content). |
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
@ -9,7 +9,7 @@
| -------- | -------------------- |
| [I00064 Tinder nightmares: the promise and peril of political bots](../../generated_pages/incidents/I00064.md) | <I>“In the days leading up to the UK’s [2017] general election, youths looking for love online encountered a whole new kind of Tinder nightmare. A group of young activists built a Tinder chatbot to co-opt profiles and persuade swing voters to support Labour. The bot accounts sent 30,000-40,000 messages to targeted 18-25 year olds in battleground constituencies like Dudley North, which Labour ended up winning by only 22 votes. [...]<br><br> “The activists maintain that the project was meant to foster democratic engagement. But screenshots of the bots’ activity expose a harsher reality. Images of conversations between real users and these bots, posted on i-D, Mashable, as well as on Fowler and Goodman’s public Twitter accounts, show that the bots did not identify themselves as automated accounts, instead posing as the user whose profile they had taken over. While conducting research for this story, it turned out that a number of [the reporters’ friends] living in Oxford had interacted with the bot in the lead up to the election and had no idea that it was not a real person.”</i><br><br> In this example people offered up their real accounts for the automation of political messaging; the actors convinced the users to give up access to their accounts to use in the operation (T0141.001: Acquire Compromised Account). The actors maintained the accounts’ existing persona, and presented themselves as potential romantic suitors for legitimate platform users (T0097.109: Romantic Suitor Persona, T0143.003: Impersonated Persona). |
| [I00065 'Ghostwriter' Influence Campaign: Unknown Actors Leverage Website Compromises and Fabricated Content to Push Narratives Aligned With Russian Security Interests](../../generated_pages/incidents/I00065.md) | _“Overall, narratives promoted in the five operations appear to represent a concerted effort to discredit the ruling political coalition, widen existing domestic political divisions and project an image of coalition disunity in Poland. In each incident, content was primarily disseminated via Twitter, Facebook, and/or Instagram accounts belonging to Polish politicians, all of whom have publicly claimed their accounts were compromised at the times the posts were made.”_ <br /> <br />This example demonstrates how threat actors can use _T0141.001: Acquire Compromised Account_ to distribute inauthentic content while exploiting the legitimate account holder’s persona. |
| [I00069 Uncharmed: Untangling Iran's APT42 Operations](../../generated_pages/incidents/I00069.md) | <i>“Mandiant identified at least three clusters of infrastructure used by [Iranian state-sponsored cyber espionage actor] APT42 to harvest credentials from targets in the policy and government sectors, media organizations and journalists, and NGOs and activists. The three clusters employ similar tactics, techniques and procedures (TTPs) to target victim credentials (spear-phishing emails), but use slightly varied domains, masquerading patterns, decoys, and themes.<br><br> Cluster A: Posing as News Outlets and NGOs: <br>- Suspected Targeting: credentials of journalists, researchers, and geopolitical entities in regions of interest to Iran. <br>- Masquerading as: The Washington Post (U.S.), The Economist (UK), The Jerusalem Post (IL), Khaleej Times (UAE), Azadliq (Azerbaijan), and more news outlets and NGOs. This often involves the use of typosquatted domains like washinqtonpost[.]press. <br><br>“Mandiant did not observe APT42 target or compromise these organizations, but rather impersonate them.”</I><br><br> In this example APT42, an Iranian state-sponsored cyber espionage actor, impersonated existing news organisations and NGOs (T0097.202: News Outlet Persona, T0097.207: NGO Persona, T0143.003: Impersonated Persona) in attempts to steal credentials from targets (T0141.001: Acquire Compromised Account), using elements of influence operations to facilitate their cyber attacks. |
| [I00071 Russia-aligned hacktivists stir up anti-Ukrainian sentiments in Poland](../../generated_pages/incidents/I00071.md) | <i>“The August 17 [2022] Telegram post [which contained a falsified letter from the Ukrainian Minister of Foreign Affairs asking Poland to rename Belwederska Street in Warsaw — the location of the Russian embassy building — as Stepan Bandera Street, in honor of the far-right nationalist who led the Ukrainian Insurgent Army during WWII] also contained screenshots of Facebook posts that appeared on two Facebook accounts belonging to Polish nationals Piotr Górka, an expert in the history of the Polish Air Force, and Dariusz Walusiak, a Polish historian and documentary maker. The Górka post suggested that he fully supported the Polish government’s decision to change Belwederska Street to Stepan Bandera Street.<br><br> “In a statement to the DFRLab, Górka said his account was accessed without his consent. “This is not my post loaded to my Facebook page,” he explained. “My site was hacked, some days ago.” At the time of publishing, Piotr Górka’s post and his Facebook account were no longer accessible.<br><br> “The post on Górka’s Facebook page was shared by Dariusz Walusiak’s Facebook account; the account also reposted it on the Facebook walls of more than twenty other Facebook users, including Adam Kalita, currently working at Krakow branch of the Institute of National Remembrance; Jan Kasprzyk, head of the Office for War Veterans and Victims of Oppression; and Alicja Kondraciuk, a Polish public figure living in Krakow.<br><br> “Walusiak’s Facebook account is also no longer accessible. Given his work on Polish history and identity, it seems highly unlikely he would support the Bandera measure; the DFRLab has also reached out to him for comment.<br><br> “The fact that Joker DPR’s Telegram post included screenshots of their Facebook posts raises the strong possibility that both Facebook accounts were compromised, and that hackers planted false statements on their pages that would seem out of character for them in order to gain further attention to the forged documents.”</I><br><br> In this example, threat actors used compromised accounts (T0141.001: Acquire Compromised Account) of Polish historians who have enough relevant knowledge to plausibly weigh in on the forged letter’s narrative (T0143.003: Impersonated Persona, T0097.101: Local Persona, T0097.108: Expert Persona).<br><br> This matches T0097.108: Expert Persona because the impersonation exploited Górka and Walusiak’s existing personas as experts in Polish history. |
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
@ -7,8 +7,7 @@
| Incident | Descriptions given for this incident |
| -------- | -------------------- |
| [I00067 Understanding Information disorder](../../generated_pages/incidents/I00067.md) | <i>“In France, in the lead-up to the 2017 election, we saw [the] labeling content as ‘‘satire” as a deliberate tactic. In one example, written up by Adrien Sénécat in Le Monde, it shows the step-by-step approach of those who want to use satire in this way.”<br><br> “PHASE 1: Le Gorafi, a satirical site [which focuses on news/current affairs], ‘‘reported” that French presidential candidate Emmanuel Macron feels dirty after touching poor people’s hands. This worked as an attack on Macron as he is regularly characterized as being out of touch and elitist.<br><br> “PHASE 2: Hyper-partisan Facebook Pages used this ‘‘claim” and created new reports, including footage of Macron visiting a factory, and wiping his hands during the visit.<br><br> “PHASE 3: The videos went viral, and a worker in another factory challenged Macron to shake his ‘‘dirty, working class hands.” The news cycle continued.”</I><br><br> In this example a satirical news website (T0097.202: News Outlet Persona, T0143.004: Parody Persona) published a narrative claiming Macron felt dirty after touching poor people’s hands. This story was uncritically amplified without the context that its origin was a parody site, and with video content appearing to support the narrative. |
| [I00070 Eli Lilly Clarifies It’s Not Offering Free Insulin After Tweet From Fake Verified Account—As Chaos Unfolds On Twitter](../../generated_pages/incidents/I00070.md) | <i>“Twitter Blue launched [November 2022], giving any users who pay $8 a month the ability to be verified on the site, a feature previously only available to public figures, government officials and journalists as a way to show they are who they claim to be.<br><br> “[A day after the launch], an account with the handle @EliLillyandCo labeled itself with the name “Eli Lilly and Company,” and by using the same logo as the company in its profile picture and with the verification checkmark, was indistinguishable from the real company (the picture has since been removed and the account has labeled itself as a parody profile).<br><br> The parody account tweeted “we are excited to announce insulin is free now.””</i><br><br> In this example an account impersonated the pharmaceutical company Eli Lilly (T0097.205: Business Persona, T0143.003: Impersonated Persona) by copying its name, profile picture (T0145.001: Copy Account Imagery), and paying for verification. |
| [I00067 Understanding Information disorder](../../generated_pages/incidents/I00067.md) | <i>“A 2019 case in the US involved a Republican political operative who created a parody site designed to look like Joe Biden’s official website as the former vice president was campaigning to be the Democratic nominee for the 2020 presidential election. With a URL of joebiden[.]info, the parody site was indexed by Google higher than Biden’s official site, joebiden[.]com, when he launched his campaign in April 2019. The operative, who previously had created content for Donald Trump, said he did not create the site for the Trump campaign directly.<br><br> “The opening line on the parody site reads: “Uncle Joe is back and ready to take a hands-on approach to America’s problems!” It is full of images of Biden kissing and hugging young girls and women. At the bottom of the page a statement reads: “This site is political commentary and parody of Joe Biden’s Presidential campaign website. This is not Joe Biden’s actual website. It is intended for entertainment and political commentary only.””</i><br><br> In this example a website was created which claimed to be a parody of Joe Biden’s official website (T0143.004: Parody Persona).<br><br> Although the website was a parody, it ranked higher than Joe Biden’s real website on Google search. |
File diff suppressed because one or more lines are too long
File diff suppressed because one or more lines are too long
@ -7,7 +7,7 @@
| Incident | Descriptions given for this incident |
| -------- | -------------------- |
| [I00088 Much Ado About ‘Somethings’ - China-Linked Influence Operation Endures Despite Takedown](../../generated_pages/incidents/I00088.md) | <i>“Beneath a video on Facebook about the war between Israel and Hamas, Lamonica Trout commented, “America is the war monger, the Jew’s own son!” She left identical comments beneath the same video on two other Facebook pages. Trout’s profile provides no information besides her name. It lists no friends, and there is not a single post or photograph in her feed. Trout’s profile photo shows an alligator.<br><br> “Lamonica Trout is likely an invention of the group behind Spamouflage, an ongoing, multi-year influence operation that promotes Beijing’s interests. Last year, Facebook’s parent company, Meta, took down 7,704 accounts and 954 pages it identified as part of the Spamouflage operation, which it described as the “largest known cross-platform influence operation [Meta had] disrupted to date.”2 Facebook’s terms of service prohibit a range of deceptive and inauthentic behaviors, including efforts to conceal the purpose of social media activity or the identity of those behind it.”</i><br><br> In this example an account attributed to a multi-year influence operation created the persona of Lamonica Trout in a Facebook account, which used an image of an animal in its profile picture (T0145.003: Animal Account Imagery). |
@ -7,12 +7,9 @@
| Incident | Descriptions given for this incident |
| -------- | -------------------- |
| [I00080 Hundreds Of Propaganda Accounts Targeting Iran And Qatar Have Been Removed From Facebook](../../generated_pages/incidents/I00080.md) | <i>“One example of a fake reporter account targeting Americans is “Jenny Powell,” a self-described Washington-based journalist, volunteer, and environmental activist. At first glance, Powell’s Twitter timeline looks like it belongs to a young and eager reporter amplifying her interests. But her profile photo is a stock image, and many of her links go to the propaganda sites.<br><br> “Powell, who joined the platform just last month, shares links to stories from major US news media outlets, retweets local news about Washington, DC, and regularly promotes content from The Foreign Code and The Economy Club. Other fake journalist accounts behaved similarly to Powell and had generic descriptions. One of the accounts, for a fake Bruce Lopez in Louisiana, has a bio that describes him as a “Correspondent Traveler noun|linking verb|noun/verb/adjective|,” which appears to reveal the formula used to write Twitter bios for the accounts.”</I><br><br> This behaviour matches T0145.006: Stock Image Account Imagery because the account was identified as using a stock image as its profile picture. |
| [I00088 Much Ado About ‘Somethings’ - China-Linked Influence Operation Endures Despite Takedown](../../generated_pages/incidents/I00088.md) | <i>“The broader War of Somethings (WoS) network, so dubbed because all the Facebook pages and user accounts in the network are connected to “The War of Somethings” page, behaves very similarly to previous Spamouflage campaigns. [Spamouflage is a coordinated inauthentic behaviour network attributed to the Chinese state.]<br><br> “Like other components of Spamouflage, the WoS network sometimes intersperses apolitical content with its more agenda-driven material. Many members post nearly identical comments at almost the same time. The text includes markers of automatic translation while error messages included as profile photos indicate the automated pulling of stock images.”</i><br><br> In this example analysts found an indicator of automated use of stock images in Facebook accounts; some of the accounts in the network appeared to have mistakenly uploaded error messages as profile pictures (T0145.006: Stock Image Account Imagery). The text posted by the accounts also appeared to have been translated using automation (T0085.008: Machine Translated Text). |
| [I00086 #WeAreNotSafe – Exposing How a Post-October 7th Disinformation Network Operates on Israeli Social Media](../../generated_pages/incidents/I00086.md) | <i>“In the wake of the Hamas attack on October 7th, the Israel Defense Forces (IDF) Information Security Department revealed a campaign of Instagram accounts impersonating young, attractive Israeli women who were actively engaging Israeli soldiers, attempting to extract information through direct messages.<br><br> [...]<br><br> “Some profiles underwent a reverse-image search of their photos to ascertain their authenticity. Many of the images searched were found to be appropriated from genuine social media profiles or sites such as Pinterest. When this was the case, the account was marked as confirmed to be inauthentic. One innovative method involves using photos that are initially frames from videos, which allows for evading reverse searches in most cases. This is seen in Figure 4, where an image uploaded by an inauthentic account was a screenshot taken from a TikTok video.”</i><br><br> In this example accounts associated with an influence operation used account imagery showing <i>“young, attractive Israeli women”</i> (T0145.007: Attractive Person Account Imagery), with some of these assets taken from existing accounts not associated with the operation (T0145.001: Copy Account Imagery). |
| [I00087 Challenging Truth and Trust: A Global Inventory of Organized Social Media Manipulation](../../generated_pages/incidents/I00087.md) | <i>“In 2017, Tanya O'Carroll, a technology and human rights adviser for Amnesty International, published an investigation of the political impact of bots and trolls in Mexico (O’Carroll, 2017). An article by the BBC describes a video showing the operation of a "troll farm" in Mexico, where people were tweeting in support of Enrique Peña Nieto of the PRI in 2012 (Martinez, 2018).<br><br>“According to a report published by El País, the main target of parties’ online strategies are young people, including 14 million new voters who are expected to play a decisive role in the outcome of the July 2018 election (Peinado et al., 2018). Thus, one of the strategies employed by these bots was the use of profile photos of attractive people from other countries (Soloff, 2017).”</i><br><br>In this example accounts copied the profile pictures of attractive people from other countries (T0145.001: Copy Account Imagery, T0145.007: Attractive Person Account Imagery). |
| [I00089 Hackers Use Fake Facebook Profiles of Attractive Women to Spread Viruses, Steal Passwords](../../generated_pages/incidents/I00089.md) | <I>“On Facebook, Rita, Alona and Christina appeared to be just like the millions of other U.S citizens sharing their lives with the world. They discussed family outings, shared emojis and commented on each other's photographs.<br><br> “In reality, the three accounts were part of a highly-targeted cybercrime operation, used to spread malware that was able to steal passwords and spy on victims.<br><br> “Hackers with links to Lebanon likely ran the covert scheme using a strain of malware dubbed "Tempting Cedar Spyware," according to researchers from Prague-based anti-virus company Avast, which detailed its findings in a report released on Wednesday.<br><br> “In a honey trap tactic as old as time, the culprits' targets were mostly male, and lured by fake attractive women.<br><br> “In the attack, hackers would send flirtatious messages using Facebook to the chosen victims, encouraging them to download a second, booby-trapped, chat application known as Kik Messenger to have "more secure" conversations. Upon analysis, Avast experts found that "many fell for the trap.””</i><br><br> In this example threat actors took on the persona of a romantic suitor on Facebook, directing their targets to another platform (T0097.109: Romantic Suitor Persona, T0145.007: Attractive Person Account Imagery, T0143.002: Fabricated Persona). |
@ -7,9 +7,8 @@
| Incident | Descriptions given for this incident |
| -------- | -------------------- |
| [I00086 #WeAreNotSafe – Exposing How a Post-October 7th Disinformation Network Operates on Israeli Social Media](../../generated_pages/incidents/I00086.md) | <i>“In the wake of the Hamas attack on October 7th, the Israel Defense Forces (IDF) Information Security Department revealed a campaign of Instagram accounts impersonating young, attractive Israeli women who were actively engaging Israeli soldiers, attempting to extract information through direct messages.<br><br> [...]<br><br> “Some profiles underwent a reverse-image search of their photos to ascertain their authenticity. Many of the images searched were found to be appropriated from genuine social media profiles or sites such as Pinterest. When this was the case, the account was marked as confirmed to be inauthentic. One innovative method involves using photos that are initially frames from videos, which allows for evading reverse searches in most cases. This is seen in Figure 4, where an image uploaded by an inauthentic account was a screenshot taken from a TikTok video.”</i><br><br> In this example accounts associated with an influence operation used account imagery showing <i>“young, attractive Israeli women”</i> (T0145.007: Attractive Person Account Imagery), with some of these assets taken from existing accounts not associated with the operation (T0145.001: Copy Account Imagery). |
| [I00087 Challenging Truth and Trust: A Global Inventory of Organized Social Media Manipulation](../../generated_pages/incidents/I00087.md) | <i>“In 2017, Tanya O'Carroll, a technology and human rights adviser for Amnesty International, published an investigation of the political impact of bots and trolls in Mexico (O’Carroll, 2017). An article by the BBC describes a video showing the operation of a "troll farm" in Mexico, where people were tweeting in support of Enrique Peña Nieto of the PRI in 2012 (Martinez, 2018).<br><br>“According to a report published by El País, the main target of parties’ online strategies are young people, including 14 million new voters who are expected to play a decisive role in the outcome of the July 2018 election (Peinado et al., 2018). Thus, one of the strategies employed by these bots was the use of profile photos of attractive people from other countries (Soloff, 2017).”</i><br><br>In this example, accounts copied the profile pictures of attractive people from other countries (T0145.001: Copy Account Imagery, T0145.007: Attractive Person Account Imagery). |
| [I00089 Hackers Use Fake Facebook Profiles of Attractive Women to Spread Viruses, Steal Passwords](../../generated_pages/incidents/I00089.md) | <I>“On Facebook, Rita, Alona and Christina appeared to be just like the millions of other U.S citizens sharing their lives with the world. They discussed family outings, shared emojis and commented on each other's photographs.<br><br> “In reality, the three accounts were part of a highly-targeted cybercrime operation, used to spread malware that was able to steal passwords and spy on victims.<br><br> “Hackers with links to Lebanon likely ran the covert scheme using a strain of malware dubbed "Tempting Cedar Spyware," according to researchers from Prague-based anti-virus company Avast, which detailed its findings in a report released on Wednesday.<br><br> “In a honey trap tactic as old as time, the culprits' targets were mostly male, and lured by fake attractive women. <br><br> “In the attack, hackers would send flirtatious messages using Facebook to the chosen victims, encouraging them to download a second, booby-trapped, chat application known as Kik Messenger to have "more secure" conversations. Upon analysis, Avast experts found that "many fell for the trap.””</i><br><br> In this example, threat actors took on the persona of a romantic suitor on Facebook, directing their targets to another platform (T0097.109: Romantic Suitor Persona, T0145.007: Attractive Person Account Imagery, T0143.002: Fabricated Persona). |
| [I00080 Hundreds Of Propaganda Accounts Targeting Iran And Qatar Have Been Removed From Facebook](../../generated_pages/incidents/I00080.md) | <i>“One example of a fake reporter account targeting Americans is “Jenny Powell,” a self-described Washington-based journalist, volunteer, and environmental activist. At first glance, Powell’s Twitter timeline looks like it belongs to a young and eager reporter amplifying her interests. But her profile photo is a stock image, and many of her links go to the propaganda sites.<br><br>“Powell, who joined the platform just last month, shares links to stories from major US news media outlets, retweets local news about Washington, DC, and regularly promotes content from The Foreign Code and The Economy Club. Other fake journalist accounts behaved similarly to Powell and had generic descriptions. One of the accounts, for a fake Bruce Lopez in Louisiana, has a bio that describes him as a “Correspondent Traveler noun\|linking verb\|noun/verb/adjective\|,” which appears to reveal the formula used to write Twitter bios for the accounts.”</I><br><br>This behaviour matches T0145.007: Stock Image Account Imagery because the account was identified as using a stock image as its profile picture. |
| [I00088 Much Ado About ‘Somethings’ - China-Linked Influence Operation Endures Despite Takedown](../../generated_pages/incidents/I00088.md) | <i>“The broader War of Somethings (WoS) network, so dubbed because all the Facebook pages and user accounts in the network are connected to “The War of Somethings” page, behaves very similarly to previous Spamouflage campaigns. [Spamouflage is a coordinated inauthentic behaviour network attributed to the Chinese state.]<br><br> “Like other components of Spamouflage, the WoS network sometimes intersperses apolitical content with its more agenda-driven material. Many members post nearly identical comments at almost the same time. The text includes markers of automatic translation while error messages included as profile photos indicate the automated pulling of stock images.”</i><br><br> In this example, analysts found an indicator of automated use of stock images in Facebook accounts; some of the accounts in the network appeared to have mistakenly uploaded error messages as profile pictures (T0145.007: Stock Image Account Imagery). The text posted by the accounts also appeared to have been translated using automation (T0085.008: Machine Translated Text). |
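The I00088 description names two detection signals analysts relied on: many accounts posting nearly identical comment text at almost the same time. A minimal sketch of how such a coordination check could work is below. This is an illustrative, hypothetical implementation, not part of the DISARM framework or the cited analysis; the five-minute window, the 0.9 similarity threshold, and the function name `find_coordinated_clusters` are all assumptions made for the example.

```python
# Hypothetical sketch: flag clusters of near-identical comments posted at
# almost the same time, one signal used to identify the WoS network.
# The window and threshold values are illustrative assumptions.
from datetime import datetime, timedelta
from difflib import SequenceMatcher

WINDOW = timedelta(minutes=5)  # assumed meaning of "almost the same time"
SIMILARITY = 0.9               # assumed meaning of "nearly identical"

def find_coordinated_clusters(comments):
    """comments: list of (timestamp, text) tuples.
    Returns lists of indices whose texts are near-identical and whose
    timestamps fall within WINDOW of the cluster's first comment."""
    clusters = []
    used = set()
    for i, (t_i, text_i) in enumerate(comments):
        if i in used:
            continue
        cluster = [i]
        for j, (t_j, text_j) in enumerate(comments[i + 1:], start=i + 1):
            if j in used:
                continue
            close_in_time = abs(t_j - t_i) <= WINDOW
            similar = SequenceMatcher(None, text_i, text_j).ratio() >= SIMILARITY
            if close_in_time and similar:
                cluster.append(j)
                used.add(j)
        if len(cluster) > 1:  # only multi-account clusters are of interest
            clusters.append(cluster)
    return clusters

comments = [
    (datetime(2023, 1, 1, 12, 0), "Great analysis, everyone should read this!"),
    (datetime(2023, 1, 1, 12, 2), "Great analysis, everyone should read this !"),
    (datetime(2023, 1, 1, 18, 0), "Great analysis, everyone should read this!"),
    (datetime(2023, 1, 1, 12, 1), "Lovely weather in Ankara today."),
]
print(find_coordinated_clusters(comments))  # → [[0, 1]]
```

Note that text similarity or timing proximity alone is weak evidence; real analyses, like the one quoted above, combine several independent indicators (imagery reuse, translation artefacts, account creation patterns) before attributing coordination.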
@@ -922,7 +922,7 @@
<tr>
<td><a href="techniques/T0097.111.md">T0097.111</a></td>
<td>Government Official Persona</td>
<td>A person who presents as an active or previous government official has the government official persona. These are officials serving in government, such as heads of government departments, leaders of countries, and members of government selected to represent constituents.<br><br> Presenting as a government official is not an indication of inauthentic behaviour, however threat actors may fabricate individuals who work in government to add credibility to their narratives (T0143.002: Fabricated Persona, T0097.111: Government Official Persona). They may also impersonate existing members of government (T0143.003: Impersonated Persona, T0097.111: Government Official Persona).<br><br> Legitimate government officials could use their persona for malicious purposes, or be exploited by threat actors (T0143.001: Authentic Persona, T0097.111: Government Official Persona). For example, a government official could take money for using their position to provide legitimacy to a false narrative, or be tricked into doing so without their knowledge.<br><br> <b>Associated Techniques and Sub-techniques</b><br> <b>T0097.110: Party Official Persona:</b> Analysts should use this sub-technique to catalogue cases where an individual is presenting as a member of a political party.<br><br> Not all government officials are political party officials (such as outside experts brought into government) and not all political party officials are government officials (such as people standing for office who are not yet working in government).<br><br> <b>T0097.206: Government Institution Persona:</b> People presenting as members of a government may also represent a government institution which they are associated with.<br><br> <b>T0097.112: Government Employee Persona:</b> Analysts should use this sub-technique to document people presenting as professionals hired to serve in government institutions and departments, not officials selected to represent constituents, or assigned official roles in government (such as heads of departments).</td>
<td>TA16</td>
</tr>
<tr>
@@ -958,7 +958,7 @@
<tr>
<td><a href="techniques/T0097.204.md">T0097.204</a></td>
<td>Think Tank Persona</td>
<td>An institution with a think tank persona presents itself as a think tank; an organisation that aims to conduct original research and propose new policies or solutions, especially for social and scientific problems.<br><br> While presenting as a think tank is not an indication of inauthentic behaviour, think tank personas are commonly used by threat actors as a front for their operational activity (T0143.002: Fabricated Persona, T0097.204: Think Tank Persona). They may be created to give legitimacy to narratives and allow them to suggest politically beneficial solutions to societal issues.<br><br> Legitimate think tanks could have a political bias that they may not be transparent about, they could use their persona for malicious purposes, or they could be exploited by threat actors (T0143.001: Authentic Persona, T0097.204: Think Tank Persona). For example, a think tank could take money for using their position to provide legitimacy to a false narrative, or be tricked into doing so without their knowledge.<br><br> <b>Associated Techniques and Sub-techniques</b><br> <b>T0097.107: Researcher Persona:</b> Institutions presenting as think tanks may also present researchers working within the organisation.</td>
<td>TA16</td>
</tr>
<tr>