Apple Confirms Scanning Of iCloud Photos For Images Of Child Abuse

Apple is planning to roll out a new system in the U.S. that expands protections for children, helping them stay safe from online predators and limiting the spread of Child Sexual Abuse Material (CSAM).

For this, the Cupertino giant is introducing new child safety features in three areas, developed in collaboration with child safety experts. These areas are communication safety in Messages, CSAM detection, and expanding guidance in Siri and Search.

“At Apple, our goal is to create technology that empowers people and enriches their lives – while helping them stay safe,” wrote Apple in a blog post. “This program is ambitious, and protecting children is an important responsibility. These efforts will evolve and expand over time.”

Communication Safety In Messages

The Messages app will add new tools to warn children and their parents when receiving or sending sexually explicit photos. This will also enable parents to play a more informed role in helping their children navigate communication online.

When receiving this type of content, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo. As an additional precaution, the child can also be told that to make sure they are safe, their parents will get a message if they do view it.

Similar protections are available if a child attempts to send sexually explicit photos. The child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it.

The Messages app will use on-device machine learning to analyze image attachments and determine if a photo is sexually explicit while keeping private communications unreadable by Apple.
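To make that flow concrete, here is a minimal sketch in Python of how such an on-device check might be wired together. Apple has not published the Messages classifier or its interfaces, so the classifier stub, the 0.9 cutoff, and the function names below are hypothetical stand-ins for illustration only.

```python
# Illustrative sketch only: Apple has not published the Messages classifier
# or its interfaces, so the classifier stub, the 0.9 cutoff, and the
# decision strings below are hypothetical stand-ins for the flow described above.

def classify_explicit(image_bytes: bytes) -> float:
    """Hypothetical on-device model returning a 0..1 'sexually explicit' score.
    Stubbed so the sketch runs without any real ML dependency; the point is
    that the image is scored locally and never leaves the device."""
    return 0.0

def handle_incoming_photo(image_bytes: bytes, child_account: bool,
                          parental_alerts: bool) -> str:
    """Decide how Messages might present an incoming photo to a child account."""
    if child_account and classify_explicit(image_bytes) > 0.9:  # assumed cutoff
        action = "blur the photo, warn the child, and offer resources"
        if parental_alerts:
            action += "; notify parents only if the child chooses to view it"
        return action
    return "display the photo normally"

print(handle_incoming_photo(b"...", child_account=True, parental_alerts=True))
```

The key design point the sketch mirrors is that the classification decision, like the photo itself, stays on the device; Apple never sees the message content.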

CSAM Detection

CSAM refers to content that depicts sexually explicit activities involving a child. iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM online.

Further, Apple's method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by the National Center for Missing and Exploited Children (NCMEC) and other child safety organizations.

Apple further transforms this database into an unreadable set of hashes that is securely stored on users' devices.

The hashing technology, called NeuralHash, analyzes an image and converts it to a unique number specific to that image. Only another image that appears nearly identical can produce the same number; for example, images that differ in size or transcoded quality will still have the same NeuralHash value.
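NeuralHash itself is not public, so as a rough illustration the sketch below uses a simple "average hash" in Python as a stand-in perceptual hash and checks it against a set of known hashes. It glosses over the blinding and private-matching layers Apple adds on top of the on-device database; the hash function and the exact-match check are assumptions made for illustration only.

```python
# Toy illustration only: NeuralHash is not public, so this uses a simple
# "average hash" as a stand-in perceptual hash, and it skips the blinding and
# private-matching layers Apple adds on top of the on-device database.

from PIL import Image  # Pillow

def average_hash(img: Image.Image, size: int = 8) -> int:
    """Downscale to an 8x8 grayscale grid and threshold each pixel against the
    mean, packing the bits into a 64-bit integer. Visually near-identical
    images (resized or re-encoded copies) tend to produce the same value."""
    small = img.convert("L").resize((size, size))
    pixels = list(small.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def matches_known_hashes(img: Image.Image, known_hashes: set) -> bool:
    """On-device check: is this image's perceptual hash in the known set?"""
    return average_hash(img) in known_hashes
```

The property this illustrates is the one the article attributes to NeuralHash: a resized or re-encoded copy of a photo tends to hash to the same value, while unrelated photos almost never collide.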

If there's a match between a user's photos and the CSAM database, Apple manually reviews each report to confirm there is a match, disables the user's account, and sends a report to NCMEC. If a user feels their account has been mistakenly flagged, they can file an appeal to have their account reinstated.

The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.
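As a rough sketch of that thresholding idea, the snippet below counts matched images per account and only escalates once a threshold is crossed. The threshold value is hypothetical since the article does not disclose the real number, and Apple's published design uses cryptographic threshold secret sharing rather than a plain counter, which is simplified away here.

```python
# A minimal sketch of the thresholding idea, assuming a plain per-account
# counter. The real threshold value is not given in the article, and Apple's
# published design uses threshold secret sharing rather than a counter, so
# both are simplified here for illustration.

MATCH_THRESHOLD = 30  # hypothetical value, not disclosed in the article

def should_escalate_for_review(matched_image_count: int) -> bool:
    """Nothing is surfaced for manual review until an account's matches
    cross the threshold; isolated matches reveal nothing about the photos."""
    return matched_image_count >= MATCH_THRESHOLD

assert not should_escalate_for_review(1)   # a stray match stays invisible
assert should_escalate_for_review(40)      # a collection crosses the threshold
```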

This new technology allows Apple to provide valuable and actionable information to NCMEC and law enforcement regarding the proliferation of known CSAM.

It does so while providing significant privacy benefits over existing techniques since Apple only learns about users' photos if they have a collection of known CSAM in their iCloud Photos account. Even in these cases, Apple only learns about images that match known CSAM.

Expanding Guidance In Siri And Search

Apple is also expanding guidance in Siri and Search by providing additional resources to help children and parents stay safe online and get help with unsafe situations. For example, users who ask Siri how they can report CSAM or child exploitation will be pointed to resources for where and how to file a report.

Siri and Search will also intervene when users try to search for CSAM-related topics. These interventions will explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue.

“Apple’s expanded protection for children is a game changer,” John Clark, the president and CEO of the National Center for Missing and Exploited Children, said in a statement.

“With so many people using Apple products, these new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material.”

Apple’s technology balances “the need for privacy with digital safety for children,” said Julia Cordua, the CEO of Thorn, a non-profit organization that builds technology to combat child sexual exploitation.

All the above-mentioned updates to the Messages app, Siri, and Search are coming later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.
