# Incident I00099: More Women Are Facing The Reality Of Deepfakes, And They’re Ruining Lives
* **Summary:** <i>Last October, British writer Helen was alerted to a series of deepfakes on a porn site that appear to show her engaging in extreme acts of sexual violence. That night, the images replayed themselves over and over in horrific nightmares and she was gripped by an all-consuming feeling of dread. “It’s like you’re in a tunnel, going further and further into this enclosed space, where there’s no light,” she tells Vogue. This feeling pervaded Helen’s life. Whenever she left the house, she felt exposed. On runs, she experienced panic attacks. Helen still has no idea who did this to her.</i>
| Technique | Description given for this incident |
| --------- | ------------------------- |
| [T0086.002 Develop AI-Generated Images (Deepfakes)](../../generated_pages/techniques/T0086.002.md) | IT00000364 <i>Last October, British writer Helen was alerted to a series of deepfakes on a porn site that appear to show her engaging in extreme acts of sexual violence. That night, the images replayed themselves over and over in horrific nightmares and she was gripped by an all-consuming feeling of dread. “It’s like you’re in a tunnel, going further and further into this enclosed space, where there’s no light,” she tells Vogue. This feeling pervaded Helen’s life. Whenever she left the house, she felt exposed. On runs, she experienced panic attacks. Helen still has no idea who did this to her.<br><br>[...]<br><br>Amnesty International has been investigating the effects of abuse against women on Twitter, specifically in relation to how they act online thereafter. According to the charity, abuse creates what they’ve called “the silencing effect” whereby women feel discouraged from participating online. The same can be said for victims of deepfakes.<br><br>Helen has never been afraid to use her voice, writing deeply personal accounts of postnatal depression. But the deepfakes created a feeling of shame so strong she thought she’d be carrying this “dirty secret” forever, and so she stopped writing.<br><br>[...]<br><br>Meanwhile, deepfake ‘communities’ are thriving. There are now dedicated sites, user-friendly apps and organised ‘request’ procedures. Some sites allow you to commission custom deepfakes for £25, while on others you can upload a woman’s image and a bot will strip her naked.<br><br>“This violation is not something that should be normalised,” says Gibi, an ASMR artist with 3.13 million YouTube subscribers. Gibi has given up trying to keep tabs on the deepfakes of her. For Gibi, the most egregious part of all of this is the fact that people are “profiting off my face, doing something that I didn’t consent to, like my suffering is your livelihood.” She’s even been approached by a company offering to remove the deepfakes — for £500 a video. This has to end. But how?</i><br><br>A website hosting pornographic content provided users the ability to create deepfake content (T0154.002: AI Media Platform, T0086.002: Develop AI-Generated Images (Deepfakes)).<br><br>Another website enabled users to commission custom deepfakes (T0152.004: Website Asset, T0148.004: Payment Processing Capability, T0086.002: Develop AI-Generated Images (Deepfakes), T0155.005: Paid Access Asset). |
| [T0152.004 Website Asset](../../generated_pages/techniques/T0152.004.md) | IT00000365 <i>Last October, British writer Helen was alerted to a series of deepfakes on a porn site that appear to show her engaging in extreme acts of sexual violence. That night, the images replayed themselves over and over in horrific nightmares and she was gripped by an all-consuming feeling of dread. “It’s like you’re in a tunnel, going further and further into this enclosed space, where there’s no light,” she tells Vogue. This feeling pervaded Helen’s life. Whenever she left the house, she felt exposed. On runs, she experienced panic attacks. Helen still has no idea who did this to her.<br><br>[...]<br><br>Amnesty International has been investigating the effects of abuse against women on Twitter, specifically in relation to how they act online thereafter. According to the charity, abuse creates what they’ve called “the silencing effect” whereby women feel discouraged from participating online. The same can be said for victims of deepfakes.<br><br>Helen has never been afraid to use her voice, writing deeply personal accounts of postnatal depression. But the deepfakes created a feeling of shame so strong she thought she’d be carrying this “dirty secret” forever, and so she stopped writing.<br><br>[...]<br><br>Meanwhile, deepfake ‘communities’ are thriving. There are now dedicated sites, user-friendly apps and organised ‘request’ procedures. Some sites allow you to commission custom deepfakes for £25, while on others you can upload a woman’s image and a bot will strip her naked.<br><br>“This violation is not something that should be normalised,” says Gibi, an ASMR artist with 3.13 million YouTube subscribers. Gibi has given up trying to keep tabs on the deepfakes of her. For Gibi, the most egregious part of all of this is the fact that people are “profiting off my face, doing something that I didn’t consent to, like my suffering is your livelihood.” She’s even been approached by a company offering to remove the deepfakes — for £500 a video. This has to end. But how?</i><br><br>A website hosting pornographic content provided users the ability to create deepfake content (T0154.002: AI Media Platform, T0086.002: Develop AI-Generated Images (Deepfakes)).<br><br>Another website enabled users to commission custom deepfakes (T0152.004: Website Asset, T0148.004: Payment Processing Capability, T0086.002: Develop AI-Generated Images (Deepfakes), T0155.005: Paid Access Asset). |
| [T0154.002 AI Media Platform](../../generated_pages/techniques/T0154.002.md) | IT00000363 <i>Last October, British writer Helen was alerted to a series of deepfakes on a porn site that appear to show her engaging in extreme acts of sexual violence. That night, the images replayed themselves over and over in horrific nightmares and she was gripped by an all-consuming feeling of dread. “It’s like you’re in a tunnel, going further and further into this enclosed space, where there’s no light,” she tells Vogue. This feeling pervaded Helen’s life. Whenever she left the house, she felt exposed. On runs, she experienced panic attacks. Helen still has no idea who did this to her.<br><br>[...]<br><br>Amnesty International has been investigating the effects of abuse against women on Twitter, specifically in relation to how they act online thereafter. According to the charity, abuse creates what they’ve called “the silencing effect” whereby women feel discouraged from participating online. The same can be said for victims of deepfakes.<br><br>Helen has never been afraid to use her voice, writing deeply personal accounts of postnatal depression. But the deepfakes created a feeling of shame so strong she thought she’d be carrying this “dirty secret” forever, and so she stopped writing.<br><br>[...]<br><br>Meanwhile, deepfake ‘communities’ are thriving. There are now dedicated sites, user-friendly apps and organised ‘request’ procedures. Some sites allow you to commission custom deepfakes for £25, while on others you can upload a woman’s image and a bot will strip her naked.<br><br>“This violation is not something that should be normalised,” says Gibi, an ASMR artist with 3.13 million YouTube subscribers. Gibi has given up trying to keep tabs on the deepfakes of her. For Gibi, the most egregious part of all of this is the fact that people are “profiting off my face, doing something that I didn’t consent to, like my suffering is your livelihood.” She’s even been approached by a company offering to remove the deepfakes — for £500 a video. This has to end. But how?</i><br><br>A website hosting pornographic content provided users the ability to create deepfake content (T0154.002: AI Media Platform, T0086.002: Develop AI-Generated Images (Deepfakes)).<br><br>Another website enabled users to commission custom deepfakes (T0152.004: Website Asset, T0148.004: Payment Processing Capability, T0086.002: Develop AI-Generated Images (Deepfakes), T0155.005: Paid Access Asset). |
| [T0155.005 Paid Access Asset](../../generated_pages/techniques/T0155.005.md) | IT00000366 <i>Last October, British writer Helen was alerted to a series of deepfakes on a porn site that appear to show her engaging in extreme acts of sexual violence. That night, the images replayed themselves over and over in horrific nightmares and she was gripped by an all-consuming feeling of dread. “It’s like you’re in a tunnel, going further and further into this enclosed space, where there’s no light,” she tells Vogue. This feeling pervaded Helen’s life. Whenever she left the house, she felt exposed. On runs, she experienced panic attacks. Helen still has no idea who did this to her.<br><br>[...]<br><br>Amnesty International has been investigating the effects of abuse against women on Twitter, specifically in relation to how they act online thereafter. According to the charity, abuse creates what they’ve called “the silencing effect” whereby women feel discouraged from participating online. The same can be said for victims of deepfakes.<br><br>Helen has never been afraid to use her voice, writing deeply personal accounts of postnatal depression. But the deepfakes created a feeling of shame so strong she thought she’d be carrying this “dirty secret” forever, and so she stopped writing.<br><br>[...]<br><br>Meanwhile, deepfake ‘communities’ are thriving. There are now dedicated sites, user-friendly apps and organised ‘request’ procedures. Some sites allow you to commission custom deepfakes for £25, while on others you can upload a woman’s image and a bot will strip her naked.<br><br>“This violation is not something that should be normalised,” says Gibi, an ASMR artist with 3.13 million YouTube subscribers. Gibi has given up trying to keep tabs on the deepfakes of her. For Gibi, the most egregious part of all of this is the fact that people are “profiting off my face, doing something that I didn’t consent to, like my suffering is your livelihood.” She’s even been approached by a company offering to remove the deepfakes — for £500 a video. This has to end. But how?</i><br><br>A website hosting pornographic content provided users the ability to create deepfake content (T0154.002: AI Media Platform, T0086.002: Develop AI-Generated Images (Deepfakes)).<br><br>Another website enabled users to commission custom deepfakes (T0152.004: Website Asset, T0148.004: Payment Processing Capability, T0086.002: Develop AI-Generated Images (Deepfakes), T0155.005: Paid Access Asset). |