Apple Provides Further Clarity on Why It Abandoned Plan to Detect CSAM in iCloud Photos

Apple on Thursday provided its fullest explanation yet for abandoning its controversial plan last year to detect known Child Sexual Abuse Material (CSAM) stored in iCloud Photos.
Apple's statement, shared with Wired and reproduced below, came in response to child safety group Heat Initiative's demand that the company "detect, report, and remove...