International Coalition of Policy Groups Urges Apple to Abandon ‘Plans to Build Surveillance Capabilities into iPhones’

An international coalition of more than 90 policy and rights groups published an open letter on Thursday urging Apple to abandon its plans to “build surveillance capabilities into iPhones, iPads, and other products” – a reference to the company’s intention to scan users’ iCloud photo libraries for images of child sex abuse (via Reuters).

“Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material (CSAM), we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children,” the groups wrote in the letter.

Some signatories of the letter, organized by the U.S.-based nonprofit Center for Democracy & Technology (CDT), are concerned that Apple’s on-device CSAM scanning system could be subverted in nations with different legal systems to search for political or other sensitive content.

“Once this backdoor feature is built in, governments could compel Apple to extend notification to other accounts, and to detect images that are objectionable for reasons other than being sexually explicit,” reads the letter.

The letter also calls on Apple to abandon planned changes to iMessage in family accounts, which would attempt to identify and blur nudity in children's messages and let children view those images only if their parents are notified. The signatories argue that the step could not only endanger children in intolerant homes or those seeking educational material, but would also break end-to-end encryption for iMessage.

Some signatories come from countries in which there are already heated legal battles over digital encryption and privacy rights, such as Brazil, where WhatsApp has been repeatedly blocked for failing to decrypt messages in criminal probes. Other signers are based in India, Mexico, Germany, Argentina, Ghana, and Tanzania. Groups that have also signed include the American Civil Liberties Union, Electronic Frontier Foundation, Access Now, Privacy International, and the Tor Project.

Apple’s plan to detect known CSAM images stored in iCloud Photos has been particularly controversial and has prompted concerns from security researchers, academics, privacy groups, and others about the system potentially being abused by governments as a form of mass surveillance. The company has tried to address concerns by publishing additional documents and a FAQ page explaining how the image-detection system will work and arguing that the risk of false detections is low.
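For readers unfamiliar with how this kind of detection works at a high level, the sketch below illustrates the general idea of comparing image hashes against a database of known material and flagging an account only once a match threshold is crossed. It is a deliberately simplified, hypothetical example: the type names, hash strings, and threshold value are invented for illustration and do not represent Apple's NeuralHash algorithm or its actual on-device matching protocol.

```swift
import Foundation

// Hypothetical, simplified illustration of threshold-based hash matching.
// Real systems use perceptual hashes and cryptographic protocols; plain
// string comparison is used here only to show the flagging logic.
struct HashMatcher {
    let knownHashes: Set<String>   // hashes of known flagged images (placeholder values)
    let threshold: Int             // number of matches required before review is triggered

    /// Counts how many of the user's image hashes appear in the known-hash set
    /// and reports whether that count reaches the review threshold.
    func evaluate(userImageHashes: [String]) -> (matches: Int, exceedsThreshold: Bool) {
        let matches = userImageHashes.filter { knownHashes.contains($0) }.count
        return (matches, matches >= threshold)
    }
}

// Example usage with placeholder hash strings.
let matcher = HashMatcher(knownHashes: ["a1b2", "c3d4", "e5f6"], threshold: 2)
let result = matcher.evaluate(userImageHashes: ["zzzz", "c3d4", "a1b2"])
print("matches: \(result.matches), flag for review: \(result.exceedsThreshold)")
```

The threshold in this sketch mirrors Apple's stated design goal of keeping the risk of false detections low by requiring multiple matches before any account is surfaced for review.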

Apple has also said it would refuse demands to expand the image-detection system beyond pictures of children flagged by recognized databases of child sex abuse material, although, as Reuters points out, it has not said it would pull out of a market rather than obey a court order.
