Apple To Scan All iPhones To Look For Child Abuse Content: A Breach Of Privacy?

Apple has officially confirmed that, starting with iOS 15, iPadOS 15, and macOS Monterey, the company will scan pictures before they are uploaded to iCloud Photos. According to Apple, this is meant to protect children from predators who might use Apple devices to recruit and exploit them, and to limit the spread of Child Sexual Abuse Material (CSAM).


The company has now developed a new tool in collaboration with child safety experts. It is designed to give parents more control over how their children use communication tools. Apple devices will soon use on-device machine learning to scan for sensitive content.

Cryptography To Limit The Spread Of CSAM

Apple has also developed a new cryptography tool to help limit the spread of CSAM. According to Apple, this tool can detect child abuse content and help the company provide accurate information to law enforcement. Apple has also updated Siri to provide more information about reporting CSAM or child exploitation with a simple search.


How Does It Work In Real Life?

These features will appear in the Messages app, where sexually explicit content received by a child will be blurred by default. The child will also be warned about the content and pointed to helpful resources, and their parents can be notified. Along the same lines, if a child tries to send a sexually explicit photo, they will be warned first, and a parent can receive a notification if the photo is sent anyway.

Do note that in both cases, Apple does not get access to those photos. As part of the process, Apple will also scan photos before they are uploaded to iCloud Photos to detect known CSAM images. According to Apple, this feature will help it assist the National Center for Missing and Exploited Children (NCMEC).

This is done using on-device matching technology, which compares a user's photos against known CSAM image hashes provided by NCMEC and other child safety organizations. A matched photo is uploaded to iCloud along with a cryptographic safety voucher that encodes the match result together with additional encrypted data.

Apple cannot view these CSAM-tagged photos on its own. However, once the number of matched photos exceeds a threshold, Apple will manually review the reports, confirm whether there is a match, and then disable the user's account. Apple will also send the report to NCMEC. If an account is flagged by mistake, the user can appeal to Apple to have it reinstated.
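To make the matching and threshold steps more concrete, here is a minimal Swift sketch of how hash-based matching with a review threshold can work in principle. It is only an illustration: Apple's actual system relies on a perceptual hash ("NeuralHash") and cryptographic techniques rather than a plain hash lookup, the real threshold value has not been disclosed, and every name and number below is hypothetical.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-ins; in Apple's design the match result stays
// encrypted inside the voucher and is not visible on the device.
struct SafetyVoucher {
    let imageID: UUID
    let matchedKnownCSAM: Bool
}

struct CSAMMatcherSketch {
    /// Hashes of known CSAM images, as supplied by NCMEC and other
    /// child safety organizations (treated here as opaque byte blobs).
    let knownHashes: Set<Data>
    /// Number of matches required before human review; the real value
    /// is not public, so any figure used here is purely illustrative.
    let reviewThreshold: Int

    /// Hash a photo and compare it against the known-hash database,
    /// producing the voucher that would accompany the iCloud upload.
    func voucher(for imageData: Data, imageID: UUID) -> SafetyVoucher {
        // SHA-256 is a stand-in; a real system would use a perceptual
        // hash so visually identical images map to the same value.
        let digest = Data(SHA256.hash(data: imageData))
        return SafetyVoucher(imageID: imageID,
                             matchedKnownCSAM: knownHashes.contains(digest))
    }

    /// Only once enough matched vouchers accumulate would the reports
    /// become reviewable and the account potentially disabled.
    func exceedsThreshold(_ vouchers: [SafetyVoucher]) -> Bool {
        vouchers.filter { $0.matchedKnownCSAM }.count >= reviewThreshold
    }
}
```

The plain Boolean above obviously cannot capture the key property Apple claims for its published design, namely that neither the device nor Apple learns anything about individual matches until the threshold is crossed.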

A Breach Of Privacy?

Apple usually advertises its devices as highly private, but this development suggests otherwise. Although the company has built this tool to prevent child abuse, there is no information on what could happen if it falls into the wrong hands.

