As Fatima discussed yesterday, the current moral panic amongst all decent Irish people concerns the fact that Elon Musk’s X-embedded AI tool, Grok, has been facilitating the production of what it is polite these days to refer to as “child abuse imagery”. Or in layman’s terms, it has been co-operating in public with requests by creepy individuals to “remove” children’s clothes from images and create new images with the child dressed only in a skimpy bikini.
This is what was, a few years ago, called AI “deepfake” porn, and it is not new. It has happened to almost every famous actress you can think of, except that in their case being dressed in a bikini was the mildest thing done to them.
The outrage has been as predictable and kneejerk as you would expect: a legion of politicians and lobby groups – most of them ideologically committed to doing anything they can to damage Elon Musk – have jumped on the story with the ferocity of a 1950s-era Catholic priest hearing that Edna O’Brien’s books are circulating freely in his parish. There have been calls for criminal investigations, regulatory actions, and so forth.
A few things here: First, I was writing about Musk’s Twitter porn problem long before this stuff became mainstream. Frankly, I find the sudden expressions of political concern baffling: this is a platform on which children can freely, if they know where to look, find uncensored videos of Bonnie Blue, Lily Phillips, and a host of other “adult stars” doing their jobs in graphic detail. Their accounts are verified. A social media platform where the most hardcore pornography imaginable can sit on your timeline beside some chud’s philosophical musings is hardly a shocking venue for depravity. It’s like complaining about finding cocaine and Viagra in a brothel.
Second, the problem with artificial intelligence is, and was always going to be, one of morality, and because it is a human construct, sexual content was always going to be one of its most profitable uses. We have already seen, over the past few years, the rise of “AI girlfriends”, “AI boyfriends”, and so on. In other parts of the internet, there has existed for years the infamous “Rule 34”, which apocryphally reads: “if it exists, there is porn of it”.
The notion that artificial intelligence can reliably be trained to recognise the age of human beings simply from images is ludicrous. Some early-teenage children look like they are 25, and some 25-year-olds look like teenagers. To the extent that AI is empowered to manipulate images at all, there is always the in-built risk that children will be involved. That risk is not unique to Grok; it applies to every AI platform with such capabilities. There’s a reason “I genuinely thought she was 18, yerhonor” is an actual legal defence against statutory rape charges in many jurisdictions.
Which poses the question: What do our politicians actually want? Clearly, they do not wish for a porn-free internet, or a porn-free social media space. Given that they have no publicly expressed difficulty with free (literally) access to hardcore porn in those spaces, we must surmise that child porn (sorry, child abuse images) is where they draw the line.
Interesting morality, isn’t it? We’re okay with actual flesh-and-blood women being sexually exploited in historically grotesque ways on camera, but we draw the line at the artificially created image of a twelve-year-old in a bikini.
The problem, of course, is that one naturally follows the other. The sexually addicted mind that can no longer resolve its addictions with a video of Bonnie Blue simultaneously servicing twenty-five men is always going to look for something harder, more raw. Go down to the criminal courts and you’ll see the same story repeated several times a month: the lad found in possession of a few hundred “images”, whose barrister stands up to tell the court that he was in a bad place, suffering from depression, and “fell into a hole”. The implication is that he didn’t start with the kiddie stuff, but that he ended up there. I’m no expert, but those are among the relatively few criminal defence mitigations that I instinctively believe to be true, at least in a lot of cases.
So here we have the politicians, many of whom have their own motives, riding in to declare that bikini-adorned 12-year-olds are where we draw the line. Gary Gannon – a TD I have great respect for, incidentally – announced his departure from X forever because he simply couldn’t share a platform with such filth. I get that. I don’t doubt his stance will be popular. But Gary: you were happy enough to share it with Bonnie Blue? (Recent tweet: “New Year, New Holes to be unlocked. #bonniebackblowout” – 656,000 views.)
The problem here is that you can’t solve the child porn issue without solving the porn issue. And our politicians are terrified of touching the porn issue, so we’ve invented this bizarre Chinese wall whereby it is perfectly legal to show an actual 18-year-old human female being sexually exploited on social media on her eighteenth birthday, but an AI image of her from literally the day before would be illegal.
How is artificial intelligence supposed to figure all that out? AI operates by rules of logic and discernment. Humans, by contrast, operate by almost literally anything else.
Yes, Twitter is a cesspit of immorality. So is TikTok. So is Snapchat (where, apparently, most of the actual child abuse images circulate, often amongst the children themselves).
Liberals created this world. We’re all now forced to live in it. If it were up to me, I’d just ban all nudity on social media apps, and have done with it. At least then the AI would have a consistent rule under which it could operate.