Apple To Scan All iPhones To Look For Child Abuse Content: A Breach Of Privacy?
Apple has officially confirmed that starting with iOS 15, iPadOS 15, and macOS Monterey, the company will scan pictures before they are uploaded to iCloud Photos. According to Apple, this is being done to protect children from predators who might use Apple devices to recruit and exploit them, and to limit the spread of Child Sexual Abuse Material (CSAM).
The company has developed a new tool in collaboration with child safety experts. This tool is designed to give parents more control over how their children use communication tools. Apple devices will soon use on-device machine learning to scan for sensitive content.
Cryptography To Limit CSAM Content Spreading
Apple has also developed a new cryptography tool to help limit the spread of CSAM. According to Apple, this tool can detect child abuse content and help the company provide accurate information to law enforcement. Apple has also updated Siri to surface information on how to report CSAM or child exploitation with a simple search.
How Does It Work In Real Life?
This will be used in apps like Messages: when someone receives sensitive content, it will be blurred by default. Children will also be warned about the content and pointed to helpful resources, and parents will get a notification about the same. Similarly, if a child tries to send a sexually explicit photo, they will be warned first, and parents can receive a message if the child sends the photo anyway.
Do note that, in both cases, Apple will not get access to those photos. As part of the process, Apple will also scan photos stored in iCloud Photos to detect known CSAM images. According to Apple, this feature will assist the National Center for Missing and Exploited Children (NCMEC).
This is done using on-device matching technology, which compares a user's photos against known CSAM image hashes provided by NCMEC and other child safety organizations. A matched photo is uploaded to iCloud along with a cryptographic safety voucher that encodes the match result and additional encrypted data.
Apple won't be able to see these CSAM-tagged photos. However, once the number of matched photos exceeds a threshold, Apple will manually review the reports, confirm whether there is a match, and then disable the user's account. Apple will also send the report to NCMEC. If an account is flagged by mistake, the user can appeal to Apple to have it reinstated.
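The matching-plus-threshold flow described above can be sketched in a few lines. This is a deliberately simplified, hypothetical illustration: Apple's real system uses a perceptual hash (NeuralHash) with private set intersection and threshold secret sharing, and has not published its threshold value. The hash strings, database, and threshold below are all stand-ins.

```python
# Toy illustration of threshold-based hash matching (not Apple's actual
# implementation). Photos are represented by precomputed hash strings.

KNOWN_CSAM_HASHES = {"hash_a", "hash_b", "hash_c"}  # stand-in database
MATCH_THRESHOLD = 2  # illustrative value only; the real one is unpublished

def match_count(photo_hashes):
    """Count how many of the photo hashes appear in the known database."""
    return sum(1 for h in photo_hashes if h in KNOWN_CSAM_HASHES)

def should_flag_for_review(photo_hashes):
    """Flag an account for manual review only once matches exceed the threshold."""
    return match_count(photo_hashes) > MATCH_THRESHOLD

print(should_flag_for_review(["hash_a", "hash_x"]))            # 1 match: not flagged
print(should_flag_for_review(["hash_a", "hash_b", "hash_c"]))  # 3 matches: flagged
```

The threshold is the key privacy mechanism in Apple's description: a single accidental match never triggers human review on its own.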
A Breach Of Privacy?
Apple usually markets its devices as highly private, but this development suggests otherwise. Though the company has developed the tool to prevent child abuse, there is no information on what happens if it falls into the wrong hands.