Each year, Pornhub – a porn site with more visitors than Netflix – releases its stats in a data science review that gives a fascinating glimpse into the psyche of its users. This year, its biggest trending search was “reality”, with viewers seeking out what the site describes as “a real homemade porn experience”. But not everyone is looking for the “real” deal. In another corner of the internet, users can browse for the most outlandish, fantastical and downright disturbing content possible and – thanks to generative AI tools that take a text prompt and produce images from it – have it customised to their own preference.
A dive into this murky world by investigative tech journalists at 404 Media made for somewhat grim reading. The idea of computer-generated personalised fantasies could be a beautifully niche exploration of an individual’s sexuality… but, eh, no. Not in this case. Instead, users were generating scenes that wouldn’t look out of place in a horror film. And in some cases, they were making deepfakes – images using the faces of celebrities, influencers, or simply anyone with a social media presence.
Deepfakes have been around for some time, but it’s the ease and scale at which they can now be generated that makes this so alarming. No matter your stance on porn – whether you’re a hard-line abolitionist or someone who supports its existence but opposes exploitation – this is nasty. It’s non-consensual both for those who, unwittingly and unwillingly, have had their likenesses taken, and for those who work in the industry and have had their livelihoods stolen.
With more mainstream generative AI tools like ChatGPT or Midjourney, there are barriers and filters that aim to stop the creation of explicit material. They don’t always work, and sometimes they depend on humans to screen particularly horrible content (and that’s a whole other story about hidden labour). But you don’t have to go far to find tools without those restrictions. They aren’t hidden away on some apocryphal Dark Web.
In the UK, there are already laws in place that address some of this: since 2015 in England and Wales, and 2016 in Scotland and Northern Ireland, it has been illegal to distribute intimate images without consent. Sexualised images of children – even if those images are synthetic – have been an offence since 2009. Companies have been trying to shut down this material, but not always motivated by ethics. One such site, Mage Space, began to block sharing, responding with the message: “This request was denied due to its high likelihood for abuse and to protect our community. Non-consensual imagery with celebrity names is not allowed on Mage.” (In other words: “Please don’t share this or we’ll get sued by the rich.”)
Life was simpler when we found our porn in hedges. The hedge porn fable has some basis: academic and sex researcher Kate Lister polled Twitter asking for people’s experiences of encountering abandoned or hidden pornographic material in the great outdoors and, of the 2,254 people who filled out the survey, 1,944 (86.2 per cent) confirmed they had found some – most of them Gen X-ers as kids. The porn found in outdoor locations was generally magazines – Razzle, Fiesta, Playboy – predominantly softcore, but occasionally a stash that was much more niche. As Lister reports in iNews, “Kids have been viewing porn long before the internet made hedge stashes redundant. This being the case, I would suggest what is needed is not necessarily efforts focused on more censorship, but on education.”
Lister has a valid point: censorship is often doomed to failure, driving behaviour underground and making it harder to debate. Academic studies down the years have tried to prove or dispute a link between sexually explicit material and anti-social or violent behaviour. No one has succeeded. A claim by one side is easily refuted by a claim from the other. Porn remains a subject where arguments are made with hearts, not minds. But most agree that there is harm when there is no consent, and AI-generated porn is non-consensual when it takes copyrighted content or, more worryingly, when it fakes identities. Is the problem the mechanics of the sexual acts? Or is it that it a) features people who did not agree to be in it and b) reaches people who do not want to see it?
I’m not arguing that porn is inherently wrong, nor that AI is inherently bad. In theory, I have no problem with either. But both of these things are so much more complex, more nuanced and more subject to human projection than we might care to acknowledge. Combined, they become more powerful than either ought to be. A moratorium on AI is likely to be about as successful as a moratorium on porn, so for now we’re stuck with both, picking our way through the caveats, hoping we can control what we’ve created.
Kate Devlin is an academic and author of “Turned On: Science, Sex and Robots” (Bloomsbury, 2018)