
DEBATE: Apple: We’re going to scan your devices for Child Porn

Here it is, folks: the perfect story to determine whether you really care about privacy, freedom, and the danger of big corporate power, or whether you are just fine with companies monitoring our every move, so long as it targets people we don’t like:

Apple unveiled plans to scan U.S. iPhones for images of child sexual abuse, drawing applause from child protection groups but raising concern among some security researchers that the system could be misused, including by governments looking to surveil their citizens.

The tool, designed to detect known images of child sexual abuse and called “neuralMatch,” will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children notified.
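For the technically curious, here is a minimal sketch of the kind of pre-upload matching the quoted report describes. To be clear about what is assumed: Apple’s actual system reportedly uses a perceptual hash (so that resized or re-encoded copies of a known image still match), not the exact cryptographic hash used below, and every name in this sketch – the hash list, the functions – is hypothetical, not Apple’s API.

```python
import hashlib

# Hypothetical database of fingerprints of known abuse images, of the
# kind maintained by the National Center for Missing and Exploited
# Children. Empty here, for obvious reasons.
KNOWN_IMAGE_HASHES: set[str] = set()

def fingerprint(image_bytes: bytes) -> str:
    # Stand-in fingerprint. Apple's system reportedly uses a perceptual
    # hash ("NeuralHash") so that cropped or re-encoded copies still
    # match; SHA-256, used here only for illustration, matches only
    # byte-identical files.
    return hashlib.sha256(image_bytes).hexdigest()

def scan_before_upload(pending_uploads: list[bytes]) -> list[bytes]:
    # Return the images that match the known list. Per the report,
    # matches go to a human reviewer, and a confirmed match means the
    # account is disabled and the NCMEC is notified.
    return [img for img in pending_uploads
            if fingerprint(img) in KNOWN_IMAGE_HASHES]
```

The point to hold onto is that the matching happens on your device, before upload – which is exactly why the only thing standing between this and scanning for anything else is the contents of that hash list.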

Before we discuss the rights and wrongs of all this, here is the Apple Podcast, explaining exactly how this will work. It’s rolling out in the USA first, naturally. But you can expect to see it here soon, no doubt:

So, right, or wrong?

Wrong, for my money, and it is not particularly close. Tim will be along later today or tomorrow to explain why he thinks this is welcome, for those of you of that bent, but for me, this is a major step in the wrong direction.

For starters, law enforcement is a Government matter. We do not, and should not, expect tech companies to take it upon themselves to act as the police. The slippery slope here is obvious, glaring, and, worst of all, inviting. If Apple can crack down on child porn, they can also crack down on almost anything else. Since we do almost everything through our phones these days, they have the ability to monitor our purchases, our private conversations, our web browsing history, and (as we see here) our photographs. Obviously, nobody is going to disagree that child pornography, and the child abuse required to create it, is an abomination – but that is also exactly why this is the perfect issue for Apple, if they want to start enhancing their ability to monitor their own customers. Who is going to object, after all, if a big company wants to voluntarily catch paedophiles?

Second: Algorithms are imperfect. Apple, for example, says that it will ensure that all images which are “flagged” by its system will be reviewed by a human being. But human beings are imperfect, too. Who is to say, for example, that a wife who sends a photograph of her baby being bathed to her husband, who is away on business (or vice versa), won’t end up being flagged as a potential paedophile, both by the Apple algorithm and by some low-paid graduate who spends his days looking at child porn and has become a little cynical? It is not hard to see how lives could be ruined here, on foot of a mistake. Apple says that won’t happen – but you would have to be very trusting indeed to believe them.

Third: Where does the right of privacy begin, let alone end? Phone companies these days provide us with all sorts of tools – fingerprint scanning, patterns, passwords, and so on – to stop other people snooping through our phones. What is the point of those, if Apple employees can snoop through our phones for us? Lots of people may have very private, very embarrassing, very personal, or very important information on their phones that they wish to keep private. Are they entitled to that privacy, or not? If Apple can monitor their phones, for example, what is to stop that technology being used by the state, or other actors, to blackmail people – or even, less dramatically, to advertise to people? If Apple is willing to flag child porn to the Government, why would it not be willing to flag, say, regular adult prostitution? Or drug use? Or fraud? Or racism, or hate speech?

Imagine, for example, that a company adopted a “no racism” policy for its employees, but also supplied them with Apple phones. Do we really think that it would be massively unpopular for that company to do a deal with Apple whereby Apple monitored phones for racist content, or things that might be considered hate speech? That might seem like a stretch today – but what’s the limiting principle? Child porn is bad, but so, obviously, is racism.

The bottom line is that child pornography – something we all hate – is being used here to set a very dangerous precedent. Slowly but surely, companies like Apple are supplanting roles that used to be reserved exclusively for Governments. There is no constitutional limitation, internally, on what they can do. There are no elections if you do not like the results. More and more, we are putting ourselves at their mercy, and doing it entirely willingly. Pardon me if I do not cheer.

 

TIM JACKSON HAS AN ALTERNATIVE VIEW HERE: 

DEBATE: “Right to privacy” doesn’t extend to viewing child porn
