Thanks to increasingly powerful hardware and ever more reliable internet access, smartphones have become the powerful computers people have always dreamed of.
In the modern era of computing and the internet, people can do practically anything with their smartphones.
But with these devices connected to the internet 24/7, data can flow behind people's backs. Few people know which of their data is being analyzed, shared, or sent.
And this time, Apple has announced a plan to scan every single picture its iPhone users have.
One day after the announcement, the head of WhatsApp took to Twitter and called the move a “surveillance system” that could be abused.
Will Cathcart criticized the decision because, in his view, the approach could also be used to scan private content for anything that Apple or a government decides it wants to control.
Cathcart also specifically referenced China as one place where such technology could be abused.
I read the information Apple put out yesterday and I'm concerned. I think this is the wrong approach and a setback for people's privacy all over the world.
People have asked if we'll adopt this system for WhatsApp. The answer is no.— Will Cathcart (@wcathcart) August 6, 2021
Apple has long needed to do more to fight CSAM, but the approach they are taking introduces something very concerning into the world.
— Will Cathcart (@wcathcart) August 6, 2021
Apple announced that it would roll out a software update through iOS 15 that would target three main areas to help keep children safe from exploitation and abuse.
The company has faced criticism for the decision.
While the specific target of the move is generally praised, the implications of the AI and algorithms used for the system are worrying.
In short, many argue that private corporations should not be responsible for acting as watchdogs.
“Instead of focusing on making it easy for people to report content that’s shared with them, Apple has built software that can scan all the private photos on your phone — even photos you haven’t shared with anyone. That’s not privacy,” Cathcart continued.
We’ve had personal computers for decades and there has never been a mandate to scan the private content of all desktops, laptops or phones globally for unlawful content. It’s not how technology built in free countries works.
— Will Cathcart (@wcathcart) August 6, 2021
This contrasts with Apple's previous approach to privacy, which requires developers to ask users for consent before their unique identifiers can be collected and shared.
Introduced in iOS 14.5, the feature annoyed developers and marketers, but was widely praised by privacy-conscious individuals.
The approach boosted Apple's privacy reputation so much that even its rival Google has tried to copy it.
Blocking trackers annoyed Facebook, which went as far as sending users notifications through its apps, including Instagram, to try to sway them into continuing to allow app tracking.
Facebook has repeatedly criticized Apple for interfering with Facebook’s ability to track users and target ads, and has called Apple one of its biggest competitors, going so far as to accuse Apple of using its dominant platform position to push its own apps over those like Facebook.
But this time, Apple appears to be backtracking, reverting course on its own privacy stance.
Over the past few years, Apple has positioned itself as a champion of user privacy and security, investing in advanced hardware and software features that forward those goals.
While this move doesn't concern trackers and instead focuses on people's well-being and safety, from a privacy perspective Cathcart considers Apple's approach a violation and an abuse.
Critics argue Apple's new Child Safety tools will tarnish that reputation.
“Can this scanning software running on your phone be error proof? Researchers have not been allowed to find out,” he wrote. “Why not? How will we know how often mistakes are violating people’s privacy?”
Read: To Be Tracked Or Not? Starting iOS 14.5, Apple Gives Users The Choice
There are so many problems with this approach, and it’s troubling to see them act without engaging experts that have long documented their technical and broader concerns with this.
— Will Cathcart (@wcathcart) August 6, 2021