
Deepfake technology is a threat to all women – not just celebrities


“Deepfake sexual abuse is often about trying to silence women who speak out,” says Durham University Professor of Law Clare McGlynn, a leading authority on deepfake laws and one of GLAMOUR’s official campaign partners. “We see this with Taylor Swift. We see this with women politicians, where deepfake porn is an attempt to intimidate them. We see it with many women in the public eye.”

Amanda Manyame, Equality Now’s Digital Rights Advisor, who works at the intersection of tech and the law, agrees that women in the public eye are at particular risk of deepfake abuse. She tells GLAMOUR, “Anyone can be a victim of deepfake image-based sexual abuse, but women in the public eye and positions of authority – such as celebrities, politicians, journalists, and human rights defenders – are particularly targeted.”

But seeing high-profile women victimised in this way also has a profound impact on ordinary women and girls. When Ellie Wilson, an advocate for justice reform, tweeted about the troubling response to the deepfakes of Swift, she was met with her own flurry of online abuse. “People threatened to make similar deepfake images of me,” she tells GLAMOUR. “These attacks for merely stating my opinion highlight just how dangerous it is for women to simply exist on the internet.”

Olivia DeRamus, the founder and CEO of Communia, a social network created by and for women, notes that even speaking up against deepfaking puts other women in danger. “Just talking about [deepfaking] as a woman paints a target on my back, along with other advocates, the female journalists covering this, the #swifties speaking out, and even female politicians who want to tackle the issue.”

Professor Clare McGlynn emphasises that deepfaking represents a threat to all women and girls, citing the “potentially devastating impact on our private and professional lives.”

It’s clear that deepfake technology is rapidly hurtling out of control. Amanda Manyame cites “rapid advances in technology and connectivity” that make it “increasingly easy and cheap to create abusive deepfake content”. She adds, “Cyberspace facilitates abuse because a perpetrator doesn’t have to be in close physical proximity to a victim.

“In addition, the anonymity provided by the internet creates the perfect environment for perpetrators to cause harm while remaining anonymous and difficult to track down.”

Moreover, most countries are ill-equipped to deal with tech-facilitated harms like deepfaked image-based abuse. Until recently in the UK, it was an offence under the Online Safety Act to share deepfake pornographic content without consent, but the law did not cover the creation of such images. “This gap,” Manyame explains, “created an enabling environment for perpetrators who know they are unlikely to be discovered or punished. The situation is worsened by the lack of legal accountability governing the tech sector, which currently does not have to ensure safety by design at the coding or creation stage.”

Meanwhile, the tech sector itself is alienating victims. As Manyame tells GLAMOUR, “Content moderation on tech platforms relies primarily on reporting by victims, but reporting mechanisms are generally difficult to use, and many platforms frequently do not respond to requests to remove abusive content or only respond after a long time.”


What is the law on deepfakes in the UK?

Under a new law announced on 16 April 2024, those who create sexually explicit deepfakes will face prosecution – thanks to an amendment to the Criminal Justice Bill. The new offence, proposed by Conservative MP Laura Farris and the Ministry of Justice, will be punishable with an unlimited fine and a criminal record.

This is a welcome update to the previous legislation, which criminalises the distribution or sharing of deepfake porn. Offenders could face prison time for sharing an explicit deepfake image online or with others.


Can anything else be done about deepfake technology?

The Online Safety Act now criminalises the sharing and the creation of non-consensual deepfake pornography, which could, as Sophie Compton, co-founder of #MyImageMyChoice, a movement tackling intimate image-based abuse, tells GLAMOUR, create “greater accountability for tech companies.” Whether this legislation will be effective is another story.

She points out that search platforms drive plenty of traffic to deepfake pornography sites – can the Online Safety Act clamp down on this? “The government needs to tackle Big Tech and their role in promoting and profiting off deepfake abuse, and get the sites and web services that are profiting off of abuse blocked from the mainstream internet.”

Professor Clare McGlynn from Durham University notes that while the Online Safety Act has the potential to tackle deepfake pornography, “There is a real risk that the legislation is a damp squib, all rhetoric and little change.” She points out that Ofcom, the UK’s communications regulator, is currently consulting on the guidance it will use to enforce the Act. “Ofcom needs to challenge the social media companies to make a step-change in their approaches […] It should focus on proactive regulation being human-rights-enhancing. It can enable women to live freer lives online.”

Ultimately, though, we need to address the misogynistic culture that empowers users to create harmful, non-consensual content of women. Helen Mort survived being deepfaked – yet she asks, “What are the cultural and social factors that make people abuse images in this non-consensual way?”

We’re still looking for answers.

GLAMOUR is campaigning for the government to introduce an Image-Based Abuse Bill in partnership with Jodie Campaigns, the End Violence Against Women Coalition, Not Your Porn, and Professor Clare McGlynn.

If you have had your intimate images shared without your consent, remember that you are not alone, and there is help available. Get in touch with the Revenge Porn Helpline at help@revengepornhelpline.org.uk. There is also a step-by-step guide on notyourporn.com, which should be followed before taking any action.

For more from Glamour UK’s Lucy Morgan, follow her on Instagram @lucyalexxandra.


