
Apple has scrapped plans to release a contentious tool that would scan iPhones, iPads, and iCloud photos for child sexual abuse material (CSAM), following backlash from critics who cited the feature's potential privacy implications.

Apple first announced the feature in 2021 with the goal of combating child exploitation and promoting safety, causes the tech community has increasingly rallied behind. But in the face of widespread criticism, it paused the rollout, saying it would “take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

Apple announced in a public statement on Wednesday that it had “decided not to move forward with our previously proposed CSAM detection tool for iCloud Photos.”

“Children can be protected without companies sifting through personal data,” the company said in a statement provided to Wired. “We will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.”

Instead, after consulting experts for feedback on its child protection initiatives, the company is refocusing its efforts on expanding its Communication Safety feature, which it first made available in December 2021. Communication Safety is an opt-in parental control feature that warns minors and their parents when image attachments in iMessage are sexually explicit and blurs them.

In 2021, Apple was criticized for its plan to launch a new tool that would scan iOS devices and iCloud photos for child abuse imagery. The tool, according to the company, would convert photos on iPhones and iPads into unreadable hashes, long strings of numbers, stored on user devices.

Once the images were uploaded to Apple’s iCloud storage service, those numbers would be compared against a database of hashes provided by the National Center for Missing and Exploited Children (NCMEC).
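To make the mechanism concrete, here is a minimal sketch of hash-based matching: compute a fingerprint for each image and test it for membership in a set of known fingerprints. It is an illustration only, using an ordinary SHA-256 cryptographic hash from CryptoKit rather than Apple's NeuralHash, which is a perceptual hash built to match visually similar images rather than identical files; the hash value, file path, and function names below are hypothetical.

```swift
import CryptoKit
import Foundation

// Hypothetical stand-in for the NCMEC-provided hash list. In Apple's proposed
// design this list was blinded, so devices could not read it directly.
let knownHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

// Compute a hex-encoded SHA-256 digest of an image file. NeuralHash would
// instead derive a perceptual fingerprint from the image's visual content.
func hashForImage(at url: URL) throws -> String {
    let data = try Data(contentsOf: url)
    let digest = SHA256.hash(data: data)
    return digest.map { String(format: "%02x", $0) }.joined()
}

// Check a photo against the known-hash list, roughly as the scan would do
// around the time an image is uploaded to iCloud Photos.
func matchesKnownHash(_ url: URL) -> Bool {
    guard let hash = try? hashForImage(at: url) else { return false }
    return knownHashes.contains(hash)
}

let photo = URL(fileURLWithPath: "/tmp/example.jpg")   // hypothetical path
if matchesKnownHash(photo) {
    print("Image matches an entry in the known-hash list")
}
```

Apple's actual proposal layered additional cryptography on top of this basic idea, including a blinded hash database and a match threshold that had to be crossed before any review could occur, details this sketch omits.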

Many child safety and security experts praised the effort's intent, acknowledging the moral duties and obligations a business has toward the goods and services it creates. However, many also described the approach as “very disturbing,” in part because a portion of Apple's procedure for vetting child abuse photographs would be carried out directly on user devices.

In a PDF document explaining the technology, which it named NeuralHash, Apple made an effort to allay concerns that governments could pressure it to add non-child-abuse photographs to the hash list. “Apple will reject any such demands,” it said. “We have previously faced requests to develop and implement legally required changes that compromise user privacy, and we have steadfastly rebuffed those requests. We will continue to reject them in the future.”

Apple disclosed that it had abandoned its plans for the tool alongside the announcement of a number of new security improvements.

Apple wants to expand end-to-end encryption of iCloud data to cover backups, photos, notes, chat histories, and other services, a move that could heighten tensions with law enforcement officials around the world while better protecting user data. The program, called Advanced Data Protection, will let customers keep certain data more secure from hackers, governments, and spies, even in the event of an Apple data breach, the company said.
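As a rough illustration of what end-to-end encryption means here, the sketch below encrypts data with a symmetric key that never leaves the user's device, so the party storing the resulting blob, cloud provider included, cannot read it. This is a conceptual example built on CryptoKit's AES-GCM, not Apple's Advanced Data Protection implementation, which manages per-service keys through a user's trusted devices and iCloud Keychain; the function names are invented for the example.

```swift
import CryptoKit
import Foundation

// A key held only on the user's devices; the server never sees it.
let deviceOnlyKey = SymmetricKey(size: .bits256)

// Encrypt data before upload. The combined blob (nonce + ciphertext + auth
// tag) is safe to store remotely; .combined is non-nil with the default
// 12-byte nonce used here.
func encryptForCloud(_ plaintext: Data) throws -> Data {
    let sealed = try AES.GCM.seal(plaintext, using: deviceOnlyKey)
    return sealed.combined!
}

// Decrypt data after download. Only a holder of deviceOnlyKey can do this,
// so a breach of the storage service exposes nothing readable.
func decryptFromCloud(_ blob: Data) throws -> Data {
    let box = try AES.GCM.SealedBox(combined: blob)
    return try AES.GCM.open(box, using: deviceOnlyKey)
}

do {
    let secret = Data("vacation photo bytes".utf8)
    let blob = try encryptForCloud(secret)     // what the cloud actually stores
    let restored = try decryptFromCloud(blob)  // only possible with the key
    assert(restored == secret)
} catch {
    print("Encryption error: \(error)")
}
```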
