Apple Abandons Controversial Plans to Detect Known CSAM in iCloud Photos
In addition to making end-to-end encryption available for iCloud Photos, Apple today announced that it has abandoned its controversial plans to detect known Child Sexual Abuse Material (CSAM) stored in iCloud Photos, according to a statement shared with WIRED.

Apple’s full statement: “After extensive consultation with experts to gather feedback on child protection initiatives we […]”