Breaking: here's the deal. Apple, after all their great recent work enhancing their privacy offering, have now spectacularly and very publicly shot themselves in the proverbial foot with their latest tool release.

The full story is revealed in the tweetstorm reproduced here, but think on this as you go through: do you know what images are being used as the checks? What's to stop a bad actor substituting in something political? A shot of Winnie the Pooh to catch out a Chinese dissident? A party logo or slogan to identify resisters? None of us supports child abuse, but you can't fight it at the risk of propagating other abuses. 'Nice try', Apple, but have another bite at this one!

Edward Snowden (@Snowden)
Apple plans to modify iPhones to constantly scan for contraband: “It is an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of our phones and laptops,” said Ross Anderson, professor of security engineering.
Matthew Green (@matthew_d_green)
I’ve had independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning tomorrow. This is a really bad idea.
These tools will allow Apple to scan your iPhone photos for photos that match a specific perceptual hash, and report them to Apple servers if too many appear.
Initially I understand this will be used to perform client side scanning for cloud-stored photos. Eventually it could be a key ingredient in adding surveillance to encrypted messaging systems.
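For readers unfamiliar with how this kind of matching works, here is a minimal sketch of perceptual hashing using the classic "difference hash" (dHash) technique. This is an illustration only: Apple's actual NeuralHash algorithm is proprietary and not public, and the function names, grid size, and threshold below are all assumptions for the sketch.

```python
# Sketch of perceptual-hash matching (dHash style), NOT Apple's NeuralHash.
# Assumes the image has already been downscaled to an 8x9 grayscale grid.

def dhash(pixels):
    """Difference hash: one bit per adjacent-pixel comparison.
    `pixels` is a grid of grayscale rows; each row of 9 values yields 8 bits."""
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left < right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches(photo_hash, db_hashes, threshold=4):
    """Flag a photo if its hash is 'near' any entry in the (opaque) database.
    The threshold is a made-up value for illustration."""
    return any(hamming(photo_hash, h) <= threshold for h in db_hashes)
```

The key property (and the key risk) is that the hash is deliberately tolerant: visually similar images produce similar hashes, so matching survives resizing and recompression, but it also means the hash is not cryptographic and near-matches can occur.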
The ability to add scanning systems like this to E2E messaging systems has been a major “ask” by law enforcement the world over. Here’s an open letter signed by former AG William Barr and other western governments.
Attorney General Barr Signs Letter to Facebook From US, UK, and Australian Leaders Regarding Use of End-To-End Encryption
The Department of Justice today published an open letter to Facebook from international law enforcement partners from the United States, United Kingdom, and Australia in response to the company’s publicly announced plans to implement end-to-end-encryption across its messaging services.
This sort of tool can be a boon for finding child pornography in people’s phones. But imagine what it could do in the hands of an authoritarian government?

The way Apple is doing this launch, they’re going to start with non-E2E photos that people have already shared with the cloud. So it doesn’t “hurt” anyone’s privacy. But you have to ask why anyone would develop a system like this if scanning E2E photos wasn’t the goal.
But even if you believe Apple won’t allow these tools to be misused there’s still a lot to be concerned about. These systems rely on a database of “problematic media hashes” that you, as a consumer, can’t review.
Hashes using a new and proprietary neural hashing algorithm Apple has developed, and gotten NCMEC to agree to use. We don’t know much about this algorithm. What if someone can make collisions?
Imagine someone sends you a perfectly harmless political media file that you share with a friend.  But that file shares a hash with some known child porn file?
These images are from an investigation using a much simpler hash function than the new one Apple's developing. They show how machine learning can be used to find such collisions.
Black-Box Attacks on Perceptual Image Hashes with GANs
tldr: This post demonstrates that GANs are capable of breaking image hash algorithms in two key ways: (1) Reversal Attack: Synthesizing the original image from the hash (2) Poisoning Attack…
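To see why collisions are even thinkable, consider a toy illustration (this is not the GAN attack above, just the underlying intuition): a 64-bit perceptual hash squeezes an astronomically large image space into at most 2^64 buckets, and, by design, many distinct inputs land in the same bucket. The values below are invented for the sketch.

```python
# Toy illustration of why perceptual hashes collide by design.
# A dHash-style bit is just "is the next pixel brighter?", so any two
# images whose brightness *ordering* agrees hash identically, no matter
# how different the actual pixel values are.

def dhash_row(row):
    # One bit per adjacent-pixel comparison in a single row.
    return tuple(1 if left < right else 0 for left, right in zip(row, row[1:]))

base = [10, 40, 20, 50, 30]      # one row of a downscaled image
tweaked = [12, 41, 19, 52, 28]   # different values, same ordering
assert dhash_row(base) == dhash_row(tweaked)  # identical hash bits
```

An attacker searching for a collision only has to steer an unrelated image's coarse brightness structure toward the target's, which is exactly the kind of optimization problem GANs and gradient methods are good at.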
The idea that Apple is a “privacy” company has bought them a lot of good press. But it’s important to remember that this is the same company that won’t encrypt your iCloud backups because the FBI put pressure on them.
Exclusive: Apple dropped plan for encrypting backups after FBI complained - sources
Apple Inc <AAPL.O> dropped plans to let iPhone users fully encrypt backups of their devices in the company’s iCloud service after the FBI complained that the move would harm investigations, six sources familiar with the matter told Reuters.
WhatsApp lead and other tech experts fire back at Apple’s Child Safety plan
Apple’s announcement of new child safety measures sparked discussion and debate in the tech community. WhatsApp’s lead said it won’t adopt a similar approach, calling it “an Apple built and operated surveillance system.”

Apple to scan iPhones for child sex abuse images
Apple announces it will implement a system to scan US iPhones for images of child sexual abuse.
Apple Open to Expanding New Child Safety Features to Third-Party Apps
