CSAM detection in iCloud Photos: is it a threat to privacy?

August 10, 2021 | By Nabeel Ahmed


Apple's introduction of CSAM scanning has privacy watchdogs worried about what such systems could mean for users' privacy in the future. Photo: Getty

Apple’s latest software addition, intended to “limit the spread of Child Sexual Abuse Material (CSAM)”, has some users cheering it on, while others are concerned about the threat it poses to privacy.

In recent months, Apple has been at the forefront of protecting its users’ privacy, so much so that the tech giant has been at loggerheads with Facebook over limiting cross-platform and individual app tracking on its devices. Now, however, with Apple announcing CSAM detection for iCloud images and iMessages, users are concerned about their privacy, and for good reason.

At first glance, the feature and its implementation look like a backdoor into users’ iCloud and iMessage images, through which the company will screen pictures for Child Sexual Abuse Material. For its part, Apple has clarified that only users who share pictures via iCloud Photos on family accounts and iMessage will be screened. “Communication safety in Messages is only available for accounts set up as families in iCloud. Parent/guardian accounts must opt in to turn on the feature for their family group. Parental notifications can only be enabled by parents/guardians for child accounts age 12 or younger,” the company revealed in a document explaining the need for, use of, and limitations of the CSAM feature.
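To make that gating concrete, here is a minimal Python sketch of the eligibility rules quoted above: the feature applies only to family iCloud accounts whose parent or guardian has opted in, and parental notifications are limited to child accounts aged 12 or younger. The data model and function names are invented for illustration and are not Apple’s actual API.

```python
from dataclasses import dataclass


@dataclass
class Account:
    """Hypothetical account record; not Apple's real data model."""
    is_family_member: bool   # account belongs to an iCloud family group
    parent_opted_in: bool    # parent/guardian enabled communication safety
    age: int                 # age on the account


def communication_safety_enabled(acct: Account) -> bool:
    # Per the quoted document, the feature applies only to family accounts
    # whose parent/guardian has explicitly opted in.
    return acct.is_family_member and acct.parent_opted_in


def parental_notification_allowed(acct: Account) -> bool:
    # Parental notifications are limited to child accounts age 12 or younger.
    return communication_safety_enabled(acct) and acct.age <= 12


# Example: a 10-year-old in an opted-in family group is covered,
# a 15-year-old in the same group does not trigger parental notifications.
child = Account(is_family_member=True, parent_opted_in=True, age=10)
teen = Account(is_family_member=True, parent_opted_in=True, age=15)
print(parental_notification_allowed(child))  # True
print(parental_notification_allowed(teen))   # False
```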

In the document, Apple also clarified that the feature cannot be used to screen users’ images and messages on their native devices, and that it will notify parents/guardians about CSAM images on child accounts only after first warning the child account. To further reassure users that the company will not gain access to their images and messages, and that the system cannot be used to reach data other than CSAM material, the tech giant said: “Our process is designed to prevent that from happening. CSAM detection for iCloud Photos is built so that the system only works with CSAM image hashes provided by NCMEC and other child safety organizations.”
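As a rough illustration of hash-based matching, the toy Python sketch below checks an image against a database of known hashes. It is only a stand-in for what Apple describes: the real system reportedly uses a perceptual “NeuralHash” and cryptographic matching techniques on device, whereas this example uses a plain SHA-256 file digest and an in-memory set, with a made-up upload folder.

```python
import hashlib
from pathlib import Path

# Stand-in for the database of known CSAM hashes supplied by NCMEC and other
# child-safety organizations. In reality these would be perceptual hashes,
# not SHA-256 digests, and would not be distributed in plain form.
KNOWN_HASHES: set[str] = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}


def image_digest(path: Path) -> str:
    """Toy stand-in for a perceptual hash: SHA-256 of the raw file bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def matches_known_csam(path: Path) -> bool:
    """Flag an image only if its digest appears in the known-hash database."""
    return image_digest(path) in KNOWN_HASHES


if __name__ == "__main__":
    # Hypothetical check run before images are uploaded to iCloud Photos.
    upload_dir = Path("uploads")  # assumed folder of pending uploads
    if upload_dir.is_dir():
        for candidate in upload_dir.glob("*.jpg"):
            if matches_known_csam(candidate):
                print(f"{candidate} matched the known-hash database")
```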

The document also said that authorities will only be notified after the CSAM system has detected explicit material, the user has been notified, and the images have been cross-checked by a human reviewer, so the chances of misreporting should be negligible. However, since even possession of such material is a punishable offence in many countries, the account will be reported to the concerned authorities after a human review.
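The sequence the document describes, detection, a notification to the user, a human review, and only then a report, might look roughly like the Python sketch below. The threshold value, function names, and state model are assumptions made for illustration, not Apple’s actual pipeline.

```python
from dataclasses import dataclass, field

# Assumed escalation threshold; an illustrative placeholder, not Apple's figure.
MATCH_THRESHOLD = 30


@dataclass
class AccountScanState:
    """Hypothetical per-account state; fields are assumptions."""
    matched_images: list[str] = field(default_factory=list)
    user_warned: bool = False


def record_match(state: AccountScanState, image_id: str) -> None:
    """Remember an image whose hash matched the known-CSAM database."""
    state.matched_images.append(image_id)


def next_step(state: AccountScanState, reviewer_confirms: bool) -> str:
    """Walk the sequence the article describes: warn, review, then report."""
    if len(state.matched_images) < MATCH_THRESHOLD:
        return "no action"                 # too few matches to escalate
    if not state.user_warned:
        state.user_warned = True           # account holder is notified first
        return "user warned"
    if reviewer_confirms:                  # human check guards against misreports
        return "reported to authorities"
    return "dismissed after human review"


# Example run: enough matches accumulate, the user is warned, and the account
# is reported only once a human reviewer confirms the flagged material.
state = AccountScanState()
for i in range(MATCH_THRESHOLD):
    record_match(state, f"img-{i}")
print(next_step(state, reviewer_confirms=False))  # "user warned"
print(next_step(state, reviewer_confirms=True))   # "reported to authorities"
```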

 

So why are privacy watchdogs edgy about the new feature?

CSAM protection systems are not new, and tech companies like Google have been using them for quite some time, so why is Apple’s implementation of the feature under scrutiny?

For starters, Apple has been the torchbearer of user privacy for some time now, and its decision to scan iCloud images and iMessages for CSAM comes as a surprising move amid its ongoing privacy war with Facebook.

Secondly, scanning images and messages with algorithms is one thing, but the human review of flagged content, along with doubts over whether the system is foolproof, not just now but also in the future, has users worried.

Lastly, and most importantly, we live in an age where Pegasus has dominated the news and tech companies across the globe are under pressure to help governments snoop on individuals, or to assist them in doing so. In such a scenario, the risk that Apple’s CSAM technology could be expanded not just to gain access to data but to target individuals who speak up against their governments is a serious one.

So far, Apple has stood by its users, refusing to be pressured into divulging information or into helping to break the encryption that protects Apple devices. How long that can last, and how far the company is willing to go to ensure that its CSAM protection is not used to threaten users’ privacy, is something only the future can tell.

For now, all we can hope is that the additional feature does its job, helps identify and catch predators, and that Apple continues to fight off requests and pressure from governments to compromise on its users’ privacy.

 
