Apple provides more details on how its new measures against child abuse will work.
Apple Announces New Technology to Prevent Child Abuse: How It Works and Why It Could Be a Problem
The tech giant announced a new policy last week that uses technology to spot potential child abuse images in iCloud and Messages. After people raised concerns about the new measures, Apple released an FAQ page explaining how the technology is used and what the privacy implications are, The Verge reports.
Apple said the technology is specifically limited to detecting child sexual abuse material (CSAM) and cannot be used as a surveillance tool.
“One of the biggest challenges in this space is protecting children while ensuring user privacy,” Apple writes on the new FAQ page.
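Apple's published technical summary describes the iCloud feature as matching on-device image fingerprints ("NeuralHash" values) against a database of hashes of known CSAM supplied by child-safety organizations, rather than analyzing photo content in general. As a rough illustration of why such a design is limited to known material, here is a minimal Swift sketch; the fingerprint function, placeholder hash values, and plain set lookup are all stand-ins, and Apple's real system uses a perceptual neural hash with threshold secret sharing, not a simple membership check.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only. Apple's system uses a perceptual "NeuralHash"
// plus threshold secret sharing and on-device safety vouchers; a
// cryptographic hash and a plain set lookup stand in for those pieces here.

// Hypothetical database of fingerprints of known CSAM, of the kind
// supplied by child-safety organizations (placeholder values).
let knownFingerprints: Set<String> = [
    "placeholder-fingerprint-1",
    "placeholder-fingerprint-2",
]

/// Fingerprint a photo's bytes (SHA-256 here; Apple's NeuralHash is
/// robust to resizing and re-encoding, unlike this stand-in).
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// A photo can only match if its fingerprint is already in the known
/// database, which is why this design cannot flag novel images.
func matchesKnownDatabase(_ imageData: Data) -> Bool {
    knownFingerprints.contains(fingerprint(of: imageData))
}
```

The design point the sketch illustrates is that a match can only fire on material already in the database, which underpins Apple's claim that the feature detects known CSAM rather than scanning users' content generally.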