
CSAM detection in iCloud Photos: is it a threat to privacy?

Published by
Nabeel Ahmed

Apple’s latest software additions to “limit the spread of Child Sexual Abuse Material (CSAM)” have some users cheering them on, while others are concerned about the threat to their privacy.

In recent months, Apple has been at the forefront of protecting its users’ privacy, so much so that the tech giant has been at loggerheads with Facebook over limiting cross-platform and individual app tracking on its devices. Now, however, with Apple announcing CSAM detection in iCloud Photos and iMessage, users are concerned about their privacy, and for good reason.

At first glance, the addition of these features and their implementation looks like a backdoor into users’ iCloud and iMessage images, through which the company will be screening pictures for Child Sexual Abuse Material. For its part, Apple has clarified that only users who use iCloud Photos with family accounts and iMessage to share pictures will be screened. “Communication safety in Messages is only available for accounts set up as families in iCloud. Parent/guardian accounts must opt in to turn on the feature for their family group. Parental notifications can only be enabled by parents/guardians for child accounts age 12 or younger,” the company revealed in a document explaining the need, use and limitations of the CSAM feature.

In the document, Apple also clarified that the feature cannot be used to screen images and messages stored only on users’ devices, and that it will notify parents/guardians about CSAM images on child accounts only after first warning the child account. To further reassure users that the company will not gain access to their images and messages, and that the system cannot be used to reach data other than CSAM material, the tech giant said: “Our process is designed to prevent that from happening. CSAM detection for iCloud Photos is built so that the system only works with CSAM image hashes provided by NCMEC and other child safety organizations.”
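
To make the hash-matching idea concrete, here is a minimal sketch in Python, assuming a plain cryptographic digest and an illustrative, empty hash set; Apple’s actual system uses a perceptual “NeuralHash” that survives resizing and recompression, matched on-device via private set intersection, so the real machinery is far more involved. What the sketch does capture is the key design point from the document: the matcher recognizes only previously catalogued images and never judges image content on its own.

```python
import hashlib
from pathlib import Path

# Illustrative stand-in for the database of known CSAM hashes.
# In Apple's design these come from NCMEC and other child-safety
# organizations; the device never sees them in plain form.
KNOWN_HASHES: set[str] = set()  # hex digests of catalogued images would live here

def image_hash(path: Path) -> str:
    """Hash the raw file bytes (a stand-in for Apple's perceptual NeuralHash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def is_match(path: Path) -> bool:
    """True only if this exact image is already in the known-hash set.

    A novel image, however explicit, can never match: the system
    only recognizes previously catalogued material.
    """
    return image_hash(path) in KNOWN_HASHES
```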

The document also said that authorities will only be notified after the CSAM system has detected explicit material, the user has been notified, and the images have been cross-checked by a human reviewer, so the chances of misreporting should be negligible. However, since even possession of such material is a punishable offence in many countries, the account will be reported to the concerned authorities after that human review.
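
As a rough sketch of that reporting pipeline, the logic below gates any report behind both a match threshold and a human review step, in the order the document describes. The threshold value and the placeholder functions are assumptions for illustration; Apple publicly indicated a threshold on the order of 30 matching images before flagged content becomes reviewable at all.

```python
from dataclasses import dataclass

# Assumption for illustration: Apple indicated roughly 30 matches
# must accumulate before anything is escalated.
MATCH_THRESHOLD = 30

@dataclass
class AccountState:
    matches: int = 0
    review_confirmed: bool = False

def on_match(state: AccountState) -> None:
    """Record one hash match; below the threshold nothing is escalated."""
    state.matches += 1
    if state.matches == MATCH_THRESHOLD:
        notify_user(state)           # the user is notified first, per the document
        request_human_review(state)  # a reviewer cross-checks the flagged images

def should_report(state: AccountState) -> bool:
    """Report to authorities only past the threshold AND after a human
    reviewer has confirmed the flagged images are genuine matches."""
    return state.matches >= MATCH_THRESHOLD and state.review_confirmed

def notify_user(state: AccountState) -> None:
    ...  # hypothetical placeholder

def request_human_review(state: AccountState) -> None:
    ...  # hypothetical placeholder; sets review_confirmed on success
```

Under this structure, a single stray match is inconsequential by design: nothing below the threshold is ever decrypted, reviewed, or reported.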

 

So why are privacy watchdogs edgy about the new feature?

CSAM protection systems are not new, and tech companies like Google have been using them for quite some time. So why is Apple’s implementation of the feature under such scrutiny?

For starters, Apple has been the torchbearer of user privacy for some time now, and its decision to scan iCloud Photos and iMessages for CSAM comes as a surprising move amid its ongoing privacy war with Facebook.

Secondly, scanning images and messages with algorithms is one thing, but human review of flagged content, and doubts over whether the system is foolproof, not just now but in the future, have users worried.

Lastly, and most importantly, we live in an age where Pegasus has dominated the news and tech companies across the globe are under pressure to help governments snoop on individuals, or to assist those who do. In such a scenario, the risk that Apple’s CSAM technology could be expanded, not just to gain access to data but to target individuals speaking up against their governments, is a serious one.

So far, Apple has stood by its users, refusing to be pressured into divulging information or helping to break the encryption that protects its devices. How long this can last, and how far the company is willing to go to ensure its CSAM protection is not used to threaten the privacy of users, is something only time will tell.

For now, all we can hope is that the new features do their job, help identify and nab predators, and that Apple continues to fight off requests and pressure from governments to compromise its users’ privacy.

 
