Apple wants to scan your iPhone photos for child porn content

Apple has announced new measures to limit the spread of Child Sexual Abuse Material (CSAM). The Cupertino-based tech giant is introducing a tool to detect CSAM, or child sexual abuse content, stored on your iPhone. The new CSAM detection features will roll out with upcoming versions of iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.

The CSAM detection features will work across three areas: Photos, Siri and Search, and Messages. Apple says these measures have been developed in collaboration with child safety experts and are designed to preserve user privacy.

The new CSAM detection tool will allow Apple to detect known child abuse images stored in iCloud Photos. Instead of scanning images in the cloud, Apple says, the tool performs “on-device matching using a database of known CSAM image hashes provided by NCMEC (National Center for Missing and Exploited Children) and other child safety organizations.”

Keeping privacy in mind, Apple says this database is transformed into an “unreadable set of hashes that is securely stored on users’ devices”.
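To give a rough sense of what on-device hash matching looks like, here is a minimal Python sketch that compares a simple perceptual hash of a photo against a locally stored set of known hashes. It is an illustration only: Apple's system uses its own NeuralHash algorithm and a blinded (unreadable) database, and the hash function, file path, and `KNOWN_HASHES` set below are assumptions made for the example.

```python
# Illustrative sketch only: a toy 64-bit "average hash" matched against a
# local set of known image hashes. Apple's real pipeline uses NeuralHash and
# a blinded database, neither of which is reproduced here.
from PIL import Image  # pip install Pillow

HASH_SIZE = 8  # 8x8 grayscale grid -> 64-bit hash

def average_hash(path: str) -> int:
    """Compute a simple 64-bit average hash of the image at `path`."""
    img = Image.open(path).convert("L").resize((HASH_SIZE, HASH_SIZE))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

# Hypothetical on-device database of known hashes (in practice, blinded).
KNOWN_HASHES = {0x8F3A5C7E91D04B26}

def is_known_match(path: str) -> bool:
    """Return True if the photo's hash appears in the local database."""
    return average_hash(path) in KNOWN_HASHES
```

In Apple's design the device never sees the raw database hashes and never learns the match result itself; the sketch above only conveys the basic matching step.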

Privacy in mind

Apple explains that this “matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result.” It highlights that the iPhone “creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.”
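Private set intersection is a family of cryptographic protocols, and the toy Python sketch below shows a classic Diffie-Hellman-style variant: each party blinds its values with a secret exponent, and only the double-blinded results are compared. This is not Apple's actual protocol (in Apple's variant the device does not learn the result), and the prime, exponents, and item lists are deliberately small, made-up values for illustration.

```python
# Toy Diffie-Hellman-style private set intersection (PSI). Illustrative and
# NOT secure: it only shows how matches can be found on blinded values
# without either side exchanging its raw data.
import hashlib
import secrets

P = 2**127 - 1  # toy prime modulus; real deployments use far larger groups

def hash_to_group(item: bytes) -> int:
    """Hash an item (e.g. an image hash) to an integer modulo P."""
    return int.from_bytes(hashlib.sha256(item).digest(), "big") % P

def blind(items, secret_exp):
    """Blind each item by exponentiating its hash with a secret exponent."""
    return [pow(hash_to_group(x), secret_exp, P) for x in items]

# Hypothetical inputs: the device's photo hashes and the server's known hashes.
device_items = [b"photo_hash_1", b"photo_hash_2"]
server_items = [b"photo_hash_2", b"photo_hash_3"]

a = secrets.randbelow(P - 3) + 2  # device's secret exponent
b = secrets.randbelow(P - 3) + 2  # server's secret exponent

# Device sends blinded values; server re-blinds them with its own secret.
device_blinded = blind(device_items, a)
double_blinded_device = [pow(v, b, P) for v in device_blinded]

# Server sends its set blinded under b; device re-blinds it under a.
server_blinded = blind(server_items, b)
double_blinded_server = {pow(v, a, P) for v in server_blinded}

# Equal items end up with equal double-blinded values, revealing only matches.
matches = [item for item, v in zip(device_items, double_blinded_device)
           if v in double_blinded_server]
print(matches)  # [b'photo_hash_2']
```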

Apple uses another technology called “threshold secret sharing” to ensure the contents of the safety vouchers “cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content”. “Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images,” Apple explains.
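Threshold secret sharing is a standard cryptographic building block. The sketch below shows Shamir's classic t-of-n scheme in Python, where a secret (here standing in for a key needed to read the vouchers) is split into shares and can only be reconstructed once a threshold number of shares is available. The field size, threshold, and `secret_key` value are illustrative assumptions, not Apple's actual parameters or implementation.

```python
# Toy Shamir (t-of-n) threshold secret sharing. Illustrative only; requires
# Python 3.8+ for modular inverse via pow(x, -1, m).
import secrets

PRIME = 2**61 - 1  # prime field modulus for the toy example

def make_shares(secret: int, threshold: int, num_shares: int):
    """Split `secret` so that any `threshold` shares can reconstruct it."""
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, num_shares + 1)]

def reconstruct(shares):
    """Recover the secret via Lagrange interpolation at x = 0."""
    total = 0
    for j, (xj, yj) in enumerate(shares):
        num, den = 1, 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * (-xm) % PRIME
                den = den * (xj - xm) % PRIME
        total = (total + yj * num * pow(den, -1, PRIME)) % PRIME
    return total

secret_key = 123456789  # stand-in for a key protecting the safety vouchers
shares = make_shares(secret_key, threshold=3, num_shares=5)
assert reconstruct(shares[:3]) == secret_key  # threshold met: recoverable
# With fewer than 3 shares, the secret stays information-theoretically hidden.
```

The analogy to Apple's description is that each matching image contributes, in effect, another share; only once an account exceeds the threshold of matches can the voucher contents be decrypted.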


Image: Apple

More CSAM detection measures

The company also announced new tools for the Messages app to warn children and their parents when receiving or sending sexually explicit photos. Apple explains that, on receiving such content, the photo will be “blurred and the child will be warned and presented with helpful resources and reassured it is okay if they do not want to view this photo.” The tool will also allow parents to be notified when a child views such content.

Similarly, if a child attempts to send sexually explicit photos, they will be warned before the photo is sent, and parents will also have the option to receive a message if the child chooses to send it anyway. The company says the “feature is designed so that Apple does not get access to the messages”.

Apple is also expanding guidance in Siri and Search by providing additional resources to help children and parents stay safe online. Siri will be able to inform users on how to report CSAM. Both Siri and Search will also be able to intervene when users perform searches for queries related to CSAM.
