Apple will not allow governments to spy through its child abuse detection tool


San Francisco: Facing criticism from many quarters over its iCloud Photos and Messages child safety initiatives, Apple has stressed that it will not allow any government to conduct surveillance through its tool for detecting and curbing child sexual abuse material (CSAM) in iCloud Photos.

Apple last week confirmed plans to roll out new technology in iOS, macOS, watchOS, and iMessage that will detect potential child abuse images.

Apple has said it will not accede to any requests from a government to expand the technology.

“Apple will deny such requests. We have already faced requests to create and deploy government-imposed changes that degrade user privacy, and we have firmly denied those requests. We will continue to deny them in the future,” the company said in a new document.

Apple said the tool has no impact on users who have not chosen to use iCloud Photos.

“There is no impact to other data on the device. This feature does not apply to messages,” the company noted.

Epic Games CEO Tim Sweeney had attacked Apple over its iCloud Photos and Messages child safety initiatives.

“This is government spyware installed by Apple on the basis of a presumption of guilt. Although Apple wrote the code, its function is to analyze personal data and report it to the government,” Sweeney said on Twitter.

WhatsApp chief Will Cathcart also criticized Apple over its plan to launch the photo-identification measures, claiming that Apple software could scan all the private photos on a user’s phone, in what he called a blatant breach of privacy.

Noting that WhatsApp will not allow such Apple tools to run on its platform, Cathcart said Apple has long needed to do more to combat child sexual abuse material (CSAM), “but the approach they adopt introduces something of great concern to the world.”

Apple said that the “CSAM detection in iCloud Photos” tool is designed to keep CSAM out of iCloud Photos without providing Apple with information about any photos other than those that match known CSAM images.

“This technology is limited to detecting CSAM stored in iCloud and we will not accede to any request from a government to extend it.”

The company further stated that the feature does not scan the iPhone’s on-device private photo library.
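As the article describes it, matching is performed only against fingerprints of known CSAM images and only for photos bound for iCloud Photos, not the on-device library. The Swift snippet below is a minimal illustrative sketch of that kind of known-hash matching under those stated constraints; the fingerprint function, the hash set, and the threshold are hypothetical placeholders, not Apple’s actual implementation (Apple’s system uses its NeuralHash algorithm and cryptographic safety vouchers, whose details are not covered in this article).

```swift
import Foundation

// Illustrative sketch only: a simplified model of "check photos queued for
// iCloud upload against a set of known CSAM fingerprints". Names and logic
// here are hypothetical and stand in for Apple's real design.

struct Photo {
    let id: UUID
    let pixelData: Data
    let isQueuedForICloudUpload: Bool   // on-device-only photos are never checked
}

/// Hypothetical fingerprint function. A real system would use a perceptual
/// hash; a plain hash of the bytes is used here only to keep the sketch
/// self-contained and runnable.
func fingerprint(of photo: Photo) -> String {
    String(photo.pixelData.hashValue)
}

/// Fingerprints of known CSAM images (supplied by child-safety organizations
/// in the real design); left empty here for illustration.
let knownCSAMFingerprints: Set<String> = []

/// Returns true only if the number of matches among photos actually being
/// uploaded to iCloud reaches the given threshold. Photos that do not match
/// known images contribute no information at all.
func exceedsMatchThreshold(photos: [Photo], threshold: Int) -> Bool {
    let matchCount = photos
        .filter { $0.isQueuedForICloudUpload }          // scope: iCloud Photos only
        .map { fingerprint(of: $0) }
        .filter { knownCSAMFingerprints.contains($0) }  // known images only
        .count
    return matchCount >= threshold
}
```

The sketch mirrors the two limits Apple emphasizes in the article: photos that never leave the device are excluded before any fingerprinting happens, and only matches against the fixed set of known images are counted.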



