What are deepfakes and why do they target women?
Deepfakes are digitally altered images, audio, or videos created using AI that appear as though someone has said or done something they never actually did. While the technology can be used for entertainment or creative purposes, deepfakes are increasingly misused as a form of digital abuse – for example, to create non-consensual sexual images, spread disinformation, or damage a person’s reputation.

Deepfakes increasingly and overwhelmingly target women. Laura Bates weighed in on why: "In part, this is about the root problem of misogyny – this is an overwhelmingly gendered issue, and what we're seeing is a digital manifestation of larger offline truth: men target women for gendered violence and abuse."

“But it's also about how the tools facilitate that abuse”, adds Bates.

AI has made these tools user-friendly: little technical expertise is needed to create and publish a deepfake image or video. In this context, the rise of "sextortion" using deepfakes – in which non-consensual, fabricated images are shared widely on pornographic sites to harass women – is a growing concern.

Once disseminated online, AI-generated deepfake pornographic images can be replicated endlessly, shared, and stored on privately owned devices, making them difficult to locate and remove.

Technology-facilitated violence isn't just about what happens on screens. What happens online easily spills into real life and escalates. (Stock photo posed by model.)
