WhatsApp head says Apple’s child safety update is a ‘surveillance system’
One day after Apple confirmed plans for new software that will allow it to detect images of child abuse in users' iCloud Photos, Facebook's head of WhatsApp says he is "concerned" by the plans.
In a thread on Twitter, Will Cathcart called it an “Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control.” He also raised questions about how such a system may be exploited in China or other countries, or abused by spyware companies.
A spokesperson for Apple disputed Cathcart’s characterization of the software, noting that users can choose to disable iCloud Photos. Apple has also said that the system is only trained on a database of “known” images provided by the National Center for Missing and Exploited Children (NCMEC) and other organizations, and that it wouldn’t be possible to make it work in a regionally-specific way since it’s baked into iOS.