
CSAM detection in iCloud Photos: is it a threat to privacy?

Published by
Nabeel Ahmed

Apple’s latest software addition, meant to “limit the spread of Child Sexual Abuse Material (CSAM)”, has some users cheering it on, while others are concerned about the threat it poses to privacy.

In recent months, Apple has been at the forefront of protecting its users’ privacy, so much so that the tech giant has been at loggerheads with Facebook after limiting cross-platform and individual app tracking on its devices. Now, however, with Apple announcing CSAM detection in iCloud Photos and iMessage, users are concerned about their privacy, and for good reason.

At first glance, the feature and its implementation look like a backdoor into users’ iCloud and iMessage images, through which the company will screen pictures for Child Sexual Abuse Material. For its part, Apple has clarified that only users sharing pictures through iCloud Photos and iMessage on family accounts will be screened. “Communication safety in Messages is only available for accounts set up as families in iCloud. Parent/guardian accounts must opt in to turn on the feature for their family group. Parental notifications can only be enabled by parents/guardians for child accounts age 12 or younger,” the company revealed in a document explaining the need, use and limitations of the CSAM feature.

In the document, Apple also clarified that the feature cannot be used to screen images and messages stored locally on users’ devices, and that it will notify parents/guardians about CSAM images on child accounts only after first warning the child account. To further reassure users that the company will not gain access to their images and messages, and that the system cannot be used to reach data other than CSAM material, the tech giant said: “Our process is designed to prevent that from happening. CSAM detection for iCloud Photos is built so that the system only works with CSAM image hashes provided by NCMEC and other child safety organizations.”
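In plainer terms, hash matching means the system never “looks at” a photo the way a human would: it compares a numeric fingerprint of each image against a fixed database of fingerprints of already-known CSAM. The sketch below illustrates that idea; it is a simplified illustration only, and the function names and the use of a cryptographic hash are assumptions made for clarity, not Apple’s actual NeuralHash system.

```python
import hashlib

# Fingerprints of known CSAM supplied by NCMEC and other child-safety
# organizations. The device only ever holds these hashes, not the images.
KNOWN_CSAM_HASHES: set[str] = set()  # populated from the provided database

def image_fingerprint(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash such as Apple's NeuralHash.

    A real perceptual hash is robust to resizing and re-encoding;
    SHA-256 is used here purely to keep the sketch self-contained.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_csam(image_bytes: bytes) -> bool:
    """True only if the image's fingerprint is already in the database.

    Because matching is against known hashes, a novel image can never
    match, which is the basis of Apple's "only known CSAM" claim.
    """
    return image_fingerprint(image_bytes) in KNOWN_CSAM_HASHES
```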

The document also said that the authorities will be notified only after the CSAM system has detected explicit material, the user has been notified, and the images have been cross-checked by a human reviewer, so the chances of misreporting should be negligible. However, since even possession of such material is a punishable offence in many countries, once the human review is complete the account will be reported to the authorities concerned.
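Sketched as pseudocode, the safeguard described above might look like the following. The threshold value and helper functions here are hypothetical, used only to show the key design point: an automated match alone never produces a report, because a human reviewer sits between detection and the authorities.

```python
MATCH_THRESHOLD = 10  # illustrative value only, not Apple's actual threshold

def human_reviewer_confirms(flagged_images: list[bytes]) -> bool:
    """Placeholder for the manual cross-check Apple describes."""
    raise NotImplementedError("performed by a trained human reviewer")

def report_to_authorities(account_id: str) -> None:
    """Placeholder for the report filed once a review confirms CSAM."""
    print(f"Reporting account {account_id} after human confirmation")

def handle_matches(account_id: str, flagged_images: list[bytes]) -> None:
    # Step 1: too few matches means nothing happens and no one is notified.
    if len(flagged_images) < MATCH_THRESHOLD:
        return
    # Step 2: a human cross-checks the flagged images before any report.
    if not human_reviewer_confirms(flagged_images):
        return  # false positive: the account is not reported
    # Step 3: only a confirmed match reaches the authorities.
    report_to_authorities(account_id)
```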


So why are privacy watchdogs edgy about the new feature?

CSAM protection systems are not new, and tech companies like Google have been using them for quite some time, so why is Apple’s implementation of the feature under scrutiny?

For starters, Apple has been the torchbearer of user privacy for some time now, and its decision to scan iCloud images and iMessages for CSAM comes as a surprise amidst its ongoing privacy war with Facebook.

Secondly, scanning images and messages with algorithms is one thing, but the human review of flagged content, and doubts over whether the system is foolproof, not just now but also in the future, have users concerned.

Lastly, and most importantly, we live in an age in which Pegasus has dominated the news and tech companies across the globe are under pressure to help governments snoop on individuals. In such a scenario, the risk that Apple’s CSAM technology could be expanded not just to gain access to data but to target individuals speaking up against their governments is a real one.

So far, Apple has stood by its users, ensuring that the company is not pressured into divulging information or helping to break the encryption that protects its devices. How long this can last, and how far the company is willing to go to ensure that its CSAM protection is not used to threaten users’ privacy, is something only time will tell.

For now, all we can hope is that the new feature does its job, helps identify and nab predators, and that Apple continues to fight off requests and pressure from governments to compromise its users’ privacy.

 
