Let Me Make This Clear! Yet Another Apple CSAM Post by The New Oil's Nate Bartram

If you’ve been on the internet in the last two weeks, you’ve probably heard about Apple’s new Child Sexual Abuse Material (CSAM) countermeasures. What you’ve heard – depending on where you’ve heard it – can range from “OMG Apple is gonna scan everything on your phone and manually review it and tell your parents!” to “Just turn off iCloud.” With so many articles, do we really need another one? Maybe not. But it’s always good to have more correct information out there to try to drown out the sensationalism, and I’ve had people reach out to ask for my personal opinion. So let’s talk about Apple’s CSAM measures and dispel some of the common myths I’ve seen.

What are Apple’s CSAM Countermeasures?

If you’ve somehow missed this story, here’s what’s happening: Apple recently announced three features that will premiere in iOS 15 in order to help combat sexual abuse of minors. The most discussed feature is NeuralHash, which works by comparing hashes. Without getting too technical for our non-techie readers, a hash is a short digital fingerprint of a file or piece of text: feed the same input through the same hashing algorithm and you always get the same output. For example, “Decentralize Today” when hashed using SHA-1 becomes “8bf4a168d6dc34f42e9dab0f3872386ea1fdbf13”. Comparing hashes is a popular technique in cybersecurity to ensure that a program you’ve downloaded has not been altered or corrupted. It’s also very common in the fight against child sexual abuse: the National Center for Missing and Exploited Children (NCMEC) maintains a hash list of known child sexual abuse images.
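To make that concrete, here’s a minimal sketch in Python of hashing and hash comparison. It uses the standard hashlib module; verify_download is just an illustrative helper I made up for this example, not anything Apple or NCMEC actually ships.

```python
import hashlib

def sha1_hex(data: bytes) -> str:
    """Return the SHA-1 digest of `data` as a hexadecimal string."""
    return hashlib.sha1(data).hexdigest()

# The same input always produces the same digest...
print(sha1_hex("Decentralize Today".encode("utf-8")))

# ...so checking a download against a published hash is just a string comparison.
def verify_download(file_path: str, published_sha1: str) -> bool:
    with open(file_path, "rb") as f:
        return sha1_hex(f.read()) == published_sha1
```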

So the big move that has everyone worried is that Apple wants to start checking iPhones for known child sexual abuse material. The way they’ve decided to do that is that iOS 15 will include this hash list from NCMEC locally on the device. Any time you upload a photo to iCloud, the image is hashed on your device and compared against that list before it gets uploaded. To protect against false positives, an account must rack up a certain number of matches before it gets flagged and reported (Apple has not disclosed the number, to prevent people from attempting to circumvent the system). Once an account is flagged, a human will verify that the matches are correct and not just an incredibly unlucky run of false positives, and if the material is indeed CSAM, Apple will alert the authorities and pass it along to them.
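For anyone who thinks better in code, here’s a deliberately simplified sketch of the idea. This is not Apple’s actual NeuralHash or its private set intersection protocol; it just illustrates threshold-based matching against a known hash list, and the threshold value is made up because Apple hasn’t published the real one.

```python
import hashlib

KNOWN_CSAM_HASHES: set[str] = set()  # in reality, a hash list supplied by NCMEC
MATCH_THRESHOLD = 30                 # illustrative only; Apple hasn't disclosed the real number

def photo_hash(photo_bytes: bytes) -> str:
    # Stand-in for a perceptual hash like NeuralHash. SHA-256 only matches
    # byte-identical files, but it keeps the example self-contained.
    return hashlib.sha256(photo_bytes).hexdigest()

def should_flag_for_human_review(photos_queued_for_icloud: list[bytes]) -> bool:
    """Count matches against the known list; only flag past the threshold."""
    matches = sum(1 for p in photos_queued_for_icloud
                  if photo_hash(p) in KNOWN_CSAM_HASHES)
    return matches >= MATCH_THRESHOLD
```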

Two other, less talked about features that often get confused with NeuralHash are changes to Siri and Search, and the use of machine learning to scan iMessages. With the Siri and Search changes, Apple will provide resources to people who make certain searches. For example, someone who searches “how to report child exploitation” will get a pop-up with resources recommended by Apple. Likewise, anyone who seems to be searching for CSAM themselves will get a pop-up reminding them that this is illegal and directing them to professional help. The iMessage feature only applies to phones set up as a child’s phone by a parent with parental controls. When that’s the case, local machine learning will attempt to detect any sexually explicit images being sent via iMessage. If such an image comes in, it will show up blurred, with a warning to the recipient that the message may be sensitive. If they decide to view it anyway, the parents will be alerted.
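Here’s a minimal, hypothetical sketch of that iMessage flow. None of these names are real Apple APIs, and the classifier is just a stand-in for the on-device machine learning model; the point is that the decision happens locally, and the only outgoing signal is the optional parent notification.

```python
from dataclasses import dataclass

@dataclass
class MessagesAccount:
    parental_controls_enabled: bool
    parent_contact: str | None = None

def looks_sexually_explicit(image_bytes: bytes) -> bool:
    # Placeholder for local, on-device ML classification.
    # Nothing is uploaded or reported to Apple in this flow.
    return False

def present_incoming_image(image_bytes: bytes, account: MessagesAccount,
                           child_views_anyway: bool) -> str:
    """Return how the image is displayed; notify the parent only if the
    account is child-configured, the image is flagged, and it is viewed."""
    if account.parental_controls_enabled and looks_sexually_explicit(image_bytes):
        if child_views_anyway and account.parent_contact:
            print(f"Notifying parent at {account.parent_contact}")
        return "blurred-with-warning"
    return "shown-normally"
```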

Clearing up Concerns

I’ve seen a lot of blatant misinformation flying around about this topic, which should really come as no surprise considering that, as the saying often attributed to Winston Churchill goes, “a lie gets halfway around the world before the truth has a chance to get its pants on.” The most common misconception is that Apple will now be scanning all the photos on your device. This is not true: Apple will only scan photos that get uploaded to iCloud. Others have said that Apple will be scanning your messages. This is also untrue; the machine learning that analyzes the images runs locally and does not send any information back to Apple. A final one is that this somehow constitutes a backdoor into iPhones – as if backdoors weren’t already possible in any proprietary, closed-source system.

That’s not to say that this move is without concern and we should all just accept it and move on. The most salient and often cited worry is the potential for abuse. Say I’m an authoritarian government and I have a picture of a protester. It’s possible that I could push a hash of that photo out to the iPhones in my country and get a list of everyone who has a copy of it. That could lead me to the person who took the photo or to people who were present at the protest, handing me a long list of dissidents to crack down on. One YouTuber made a pretty valid point about how the machine learning and parent alerts on iMessage could out LGBTQ teens to their parents, and unfortunately parents kicking out and disowning their own children for being LGBTQ is not uncommon. Others have pointed out that this measure, while well-intentioned, does nothing to crack down on the people actually producing CSAM, only those spreading it around. That’s still important, but it fails to really get to the root of the sickness.

The Verdict

These kinds of moves are always tricky. On the one hand, despite constantly being used as a bogeyman behind which the state can hide while it violates our privacy, things like child sexual abuse and drug trafficking are real problems that should be addressed, and there’s no reason we shouldn’t use the new tools at our disposal to do so. Except, of course, for the invasion and risk of abuse that a tool like this very much presents. Good guys will always operate at a disadvantage to bad guys, because bad guys by definition don’t worry about things like laws or ethics. Good guys will always have the challenge of stopping the bad guys without betraying the very values they’re attempting to uphold.

Personally, I think this specific move isn’t the worst idea someone could’ve come up with. My concern, like that of other experts, is where it could lead. The slippery slope gets cited a lot in the privacy community, and while it’s formally a fallacy, it’s not without precedent. Cory Doctorow calls this the “shitty tech adoption curve”: the process by which oppressive technologies are first tried out on people with the least power and then gradually normalized for everyone else. Take, for example, Baltimore’s recent “Winter Soldier” plane, as I like to call it. The aerial surveillance plane flew nearly 24/7 and vacuumed up footage of a 32-square-mile section of the city, footage that could later be reviewed for criminals, crimes, or other direct evidence. That program was adapted for use in America from surveillance drones flown in combat in Afghanistan, and it’s just one example of many. The slippery slope isn’t just a fallacy; it’s a real thing. The idea that these systems – both the hash list and the machine learning – could be abused is not without merit, and that abuse potential is what concerns me most.

So what do I recommend? As a stopgap, of course, disable iCloud and don’t use iMessage – recommendations I would make to everyone anyway. For the more tech savvy, I think now more than ever is the time to look into alternative phones: CalyxOS, GrapheneOS, the PinePhone, even a Librem 5 (if you can get your hands on one). But even for the less tech savvy, I think the time is now to learn. I don’t believe that everyone should “learn to code.” I’ve seen some people suggest – dead seriously – that the solution to the privacy situation we face right now is basically to raise all of our kids to be developers who can build and host all their own stuff from scratch. That’s ridiculous. But at the same time, we are moving into a world where the baseline of tech literacy is constantly rising. Just as it is no longer socially acceptable to need help setting up your email, a new computer, or even WiFi, we must now become comfortable with things like flashing custom ROMs, using Linux, and other things that were considered advanced just a few years ago. Maybe I’m wrong. Maybe I’m being extremist. But it seems like the only logical progression. We’ve seen this repeated throughout human history: once everyone knew how to read, formal education became the measure. Once a high school diploma became common, a college degree was required. The floor is rising again, and for better or worse, I think we’re only cheating ourselves by failing to rise with it.

So now you know...brought to you courtesy of decentralize.today's new series 'Let Me Make This Clear!'

You can catch up with Nate and find more of his work on his website at thenewoil.xyz
