Apple is rolling out new safety features on iOS that aim to provide more security for young users. However, one of those tools has drawn instant criticism because it scans photos for known child sexual abuse material when a user uploads them to iCloud. If enough matches are found, Apple reviews the content and reports the user to the National Center for Missing & Exploited Children (NCMEC), which works with law enforcement.
Of course, scanning content for Child Sexual Abuse Material (CSAM) is not new. Many cloud storage services already do this, but Apple had not added the feature until now because the company was concerned about breaching user privacy.
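To illustrate the general idea behind this kind of scanning, the sketch below matches uploads against a database of hashes of known flagged images. This is a deliberately simplified, hypothetical example: every name in it is invented, and real systems (Apple's included) use perceptual hashes of image content rather than cryptographic hashes of raw bytes, plus cryptographic safeguards so nothing is revealed until a match threshold is crossed.

```python
import hashlib

# Illustrative database of hashes of known flagged images.
# In practice this would come from an organization like NCMEC,
# and would use perceptual (content-based) hashes, not SHA-256.
KNOWN_HASHES = {
    hashlib.sha256(b"flagged-example-1").hexdigest(),
    hashlib.sha256(b"flagged-example-2").hexdigest(),
}

# Only flag an account after several matches, reducing the impact
# of any single false positive (Apple describes a similar threshold).
REPORT_THRESHOLD = 3

def scan_uploads(photos: list[bytes]) -> bool:
    """Return True if the uploads contain enough known-hash matches
    to cross the reporting threshold."""
    matches = sum(
        1 for photo in photos
        if hashlib.sha256(photo).hexdigest() in KNOWN_HASHES
    )
    return matches >= REPORT_THRESHOLD
```

The key property critics focus on is that the matching happens on the user's device against an opaque database, so the same mechanism could in principle be pointed at other content.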
Cupertino thinks it has solved this problem and is launching the feature. However, many commentators argue that Apple has not actually resolved the privacy concerns and that the new tool does, in fact, breach user privacy.
One of those critics is the Electronic Frontier Foundation (EFF), which published a post slamming Apple. The EFF argues Apple now has a backdoor into the private lives of users. Moreover, the company will contact authorities without the user ever knowing. The EFF says this is not good enough and compromises the overall privacy of iPhone and iPad users:
“Child exploitation is a serious problem, and Apple isn’t the first tech company to bend its privacy-protective stance in an attempt to combat it. But that choice will come at a high price for overall user privacy. Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.
To say that we are disappointed by Apple’s plans is an understatement… We’ve said it before, and we’ll say it again now: it’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses.”
This is clearly a sensitive issue that once again raises the question: is compromising privacy worth it if it's for some greater good? Some would argue yes, but the EFF and other critics suggest that Apple's stance, and others like it, do not necessarily contribute to the greater good.
In response, Apple has circulated an internal memo defending the new tools:
“Today marks the official public unveiling of Expanded Protections for Children, and I wanted to take a moment to thank each and every one of you for all of your hard work over the last few years. We would not have reached this milestone without your tireless dedication and resiliency.
Keeping children safe is such an important mission. In true Apple fashion, pursuing this goal has required deep cross-functional commitment, spanning Engineering, GA, HI, Legal, Product Marketing and PR. What we announced today is the product of this incredible collaboration, one that delivers tools to protect children, but also maintain Apple’s deep commitment to user privacy.
We’ve seen many positive responses today. We know some people have misunderstandings, and more than a few are worried about the implications, but we will continue to explain and detail the features so people understand what we’ve built. And while a lot of hard work lays ahead to deliver the features in the next few months, I wanted to share this note that we received today from NCMEC. I found it incredibly motivating, and hope that you will as well.
I am proud to work at Apple with such an amazing team. Thank you!”