Apple Provides Further Clarity on Why It Abandoned Plan to Detect CSAM in iCloud Photos

Apple on Thursday provided its fullest explanation yet for why it abandoned its controversial plan last year to detect known Child Sexual Abuse Material (CSAM) stored in iCloud Photos. Apple’s statement, shared with WIRED and reproduced below, came in response to child safety group Heat Initiative’s demand that the company “detect, report, and remove” CSAM from iCloud […]


Apple Abandons Controversial Plans to Detect Known CSAM in iCloud Photos

In addition to making end-to-end encryption available for iCloud Photos, Apple today announced that it has abandoned its controversial plans to detect known Child Sexual Abuse Material (CSAM) stored in iCloud Photos, according to a statement shared with WIRED. Apple’s full statement: After extensive consultation with experts to gather feedback on child protection initiatives we […]


Apple Remains Silent About Plans to Detect Known CSAM Stored in iCloud Photos

It has now been over a year since Apple announced plans for three new child safety features: a system to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, an option to blur sexually explicit photos in the Messages app, and child exploitation resources for Siri. The latter two features are […]
