WhatsApp head calls Apple’s CSAM system “very concerning”

The head of Facebook-owned WhatsApp slammed Apple’s decision to scan iPhones for child abuse images in a Twitter thread.

Apple has long been regarded by users as one of the strongest companies on privacy and security, but technology experts are now scrutinizing the American tech giant over this decision.

Apple previously announced a plan to release software that could search for and detect child sexual abuse material on the phones of US users. Human reviewers could then alert authorities to potential illegal activity.

The system, which Apple is currently testing in the US, is expected to be rolled out to other countries in the future.

Will Cathcart, the head of Facebook’s WhatsApp, says that this system would allow Apple to “scan all of a user’s private photos on your phone, even photos you haven’t shared with anyone.” He added that WhatsApp has worked to streamline ways to report and ban those who traffic in CSAM without breaking its users’ encryption and privacy. An Apple spokesperson denied some of Cathcart’s claims, pointing out that the new software would only detect child sexual abuse material in photos uploaded to iCloud Photos, which users can disable at any time.
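
To make the distinction concrete, here is a minimal sketch of the behavior Apple describes: detection runs only on photos bound for iCloud Photos, and is skipped entirely when the user has disabled iCloud Photos. All names in this sketch (Photo, ScanSettings, isKnownCSAMHashMatch, and so on) are hypothetical illustrations, not Apple APIs.

```swift
// Hypothetical sketch: only photos queued for iCloud upload are checked,
// and nothing is checked if iCloud Photos is turned off.

struct Photo {
    let id: String
    let isQueuedForICloudUpload: Bool
}

struct ScanSettings {
    let iCloudPhotosEnabled: Bool
}

/// Hypothetical stand-in for the on-device matching step; in Apple's
/// description this compares a hash of the photo against a database of
/// known CSAM hashes. Stubbed out here.
func isKnownCSAMHashMatch(_ photo: Photo) -> Bool {
    return false
}

func photosToFlag(from library: [Photo], settings: ScanSettings) -> [Photo] {
    // If iCloud Photos is disabled, nothing is scanned at all.
    guard settings.iCloudPhotosEnabled else { return [] }

    // Only photos bound for iCloud upload are ever checked.
    return library
        .filter { $0.isQueuedForICloudUpload }
        .filter { isKnownCSAMHashMatch($0) }
}
```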

Cathcart pointed to a 2016 letter from Apple to its customers, in which the company said: “it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.”

The Apple spokesperson said Apple uses one operating system globally, which leaves no way for specific governments or actors to modify the system for a particular region or device.

Apple’s new measures:

As noted, Apple wants to protect children on its platforms from grooming and exploitation, and to prevent the spread of CSAM (child sexual abuse material). To that end, it has announced three new measures:

  • Communication safety in Messages
  • CSAM detection
  • Expanding guidance in Siri and Search


The new measures are only available to children who are members of a shared iCloud family account. The system does not work for anyone over the age of 18, so it can’t prevent or detect unsolicited images sent between two co-workers, for instance, since the recipient must be a child.

Under-18s on Apple’s platforms are divided further still. If a child is between 13 and 17, parents won’t have the option to receive notifications, but the child will still see the content warnings. For children under 13, both content warnings and parental notifications are available.
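
The age brackets above amount to a simple rule table, sketched below with hypothetical types (SafetyFeatures, features(forChildAge:inFamilyAccount:)); this is an illustration of the described policy, not Apple code.

```swift
// Hypothetical sketch of the age-based rules described above:
// under-13s get content warnings plus parental notifications,
// 13-17-year-olds get content warnings only, and adults are not covered.

struct SafetyFeatures {
    let contentWarnings: Bool
    let parentalNotifications: Bool
}

func features(forChildAge age: Int, inFamilyAccount: Bool) -> SafetyFeatures {
    // The measures only apply to children in a shared iCloud family account.
    guard inFamilyAccount, age < 18 else {
        return SafetyFeatures(contentWarnings: false, parentalNotifications: false)
    }
    if age < 13 {
        return SafetyFeatures(contentWarnings: true, parentalNotifications: true)
    }
    // Ages 13-17: warnings are shown to the child, but parents are not notified.
    return SafetyFeatures(contentWarnings: true, parentalNotifications: false)
}
```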
