Aug 06, 2021

Apple to scan iPhones for child sex abuse images

Apple has announced details of a system to find child sexual abuse material (CSAM) on customers' devices.
 
Before an image is stored in iCloud Photos, the technology will search it for matches against a database of already known CSAM.
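In outline, this kind of on-device matching amounts to hashing each image and checking the hash against a database of known material before upload. The sketch below is a simplified illustration only: Apple's actual system uses a perceptual hash (which it calls "NeuralHash") plus cryptographic protocols to keep both the database and the results private, not a plain digest lookup like this.

```python
import hashlib

def image_hash(data: bytes) -> str:
    """Digest of the image bytes (a stand-in for a perceptual hash)."""
    return hashlib.sha256(data).hexdigest()

def matches_known(data: bytes, known_hashes: set) -> bool:
    """True if the image's hash appears in the known-material hash set."""
    return image_hash(data) in known_hashes

# Example: a database containing one known image, then two checks.
known = {image_hash(b"known-image-bytes")}
print(matches_known(b"known-image-bytes", known))  # True
print(matches_known(b"new-photo-bytes", known))    # False
```

A real perceptual hash differs from SHA-256 in that visually similar images (resized, recompressed) produce the same or nearby hashes, which is what makes matching robust to simple edits.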

Apple said that if a match is found, a human reviewer will then assess it and report the user to law enforcement.

However, there are privacy concerns that the technology could be expanded to scan phones for prohibited content or even political speech.

Experts worry that the technology could be used by authoritarian governments to spy on their citizens.

Apple said that new versions of iOS and iPadOS - due to be released later this year - will have "new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy".

(BBC News) 
