Monday, April 21

Meta has announced plans to launch, later this year, a new safety tool aimed at preventing children from receiving nude images, and discouraging them from sending them, even in encrypted chats.

The tool is expected to be optional and will also be available to adults on Instagram and Facebook. The move comes after Meta faced criticism from government and law enforcement agencies over its decision to encrypt Messenger chats by default. Critics argue that encryption will make it harder for the company to detect instances of child abuse.

According to Meta, the new feature is designed to protect users, especially women and teenagers, from receiving nude images or feeling pressured to send them. Users under the age of 13 are not permitted to use Meta’s platforms. Additionally, Meta announced that minors will, by default, be unable to receive messages from strangers on Instagram and Messenger.

Police chiefs have said that youngsters sharing nude images has contributed to the rise in sexual offenses committed by children in England and Wales. Legal filings from a US lawsuit against Meta allege that around 100,000 teenage users of Facebook and Instagram face online sexual harassment daily. Meta has refuted these claims, accusing the lawsuit of misrepresenting its efforts.

On Thursday, the tech giant revealed plans for a new feature aimed at protecting teenagers from inappropriate images in their messages, including encrypted chats, with further details to be disclosed later this year.

Meta’s decision to protect Facebook Messenger chats with end-to-end encryption (E2EE) by default has drawn criticism from various quarters, including government, law enforcement, and children’s charities. E2EE ensures that only the sender and recipient can read messages, making it difficult for Meta to detect and report child abuse material.
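For readers unfamiliar with how E2EE achieves this, the sketch below illustrates the core idea with a toy Diffie-Hellman key exchange: the two endpoints derive a shared key from public values, so the server relaying the message only ever sees ciphertext. This is emphatically not Meta's protocol (Messenger is based on the Signal protocol), and the tiny prime and XOR "cipher" here are insecure simplifications for illustration only.

```python
# Toy illustration of why E2EE hides message content from the relaying
# server. NOT a real protocol: the group is far too small and XOR is not
# a secure cipher; both are simplified purely for illustration.
import hashlib

P = 0xFFFFFFFB  # small public prime (real systems use much larger groups)
G = 5           # public generator

def keypair(secret: int):
    """Return (private, public) for a toy Diffie-Hellman exchange."""
    return secret, pow(G, secret, P)

def shared_key(my_private: int, their_public: int) -> bytes:
    """Both endpoints derive the same key; the server never sees it."""
    shared = pow(their_public, my_private, P)
    return hashlib.sha256(str(shared).encode()).digest()

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher (XOR keystream) -- illustration only."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Each endpoint holds a private value; only the public values transit the server.
a_priv, a_pub = keypair(123456)
b_priv, b_pub = keypair(654321)

key_a = shared_key(a_priv, b_pub)  # sender's derived key
key_b = shared_key(b_priv, a_pub)  # recipient's derived key (identical)

ciphertext = xor_cipher(key_a, b"hello")   # all the server can see
plaintext = xor_cipher(key_b, ciphertext)  # only the recipient recovers this
```

Because the server only handles `a_pub`, `b_pub`, and `ciphertext`, it has no way to inspect message content, which is exactly why detection of abuse material becomes difficult.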

While other messaging apps like Apple’s iMessage, Signal, and Meta-owned WhatsApp use E2EE and defend the technology, some critics advocate for client-side scanning to detect child abuse in encrypted apps. Client-side scanning involves systems on a user’s device scanning messages for known child abuse images before encryption, reporting any suspected illegal content to the company.

The children’s charity the NSPCC sees Meta’s new system as a compromise between safety and privacy in E2EE environments. Meta emphasizes that the new feature is not client-side scanning, which it believes undermines the privacy protections of encryption. Instead, the system will use machine learning to identify nudity, operating entirely on the device.
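The distinction Meta draws can be made concrete with a sketch of the on-device flow: a local classifier scores an incoming image and the client blurs it and warns the user, with nothing reported off the phone. Meta has not published its model or thresholds, so the classifier below is a hypothetical stub and the `NUDITY_THRESHOLD` value is an assumption.

```python
# Sketch of an on-device nudity filter: classify locally, act locally,
# send nothing to a server. The classifier is a hypothetical stub -- a
# real implementation would run a trained image model on the device.
from dataclasses import dataclass

NUDITY_THRESHOLD = 0.8  # assumed cutoff; Meta's actual threshold is unknown

@dataclass
class IncomingImage:
    pixels: bytes
    nudity_score: float = 0.0  # stands in for the local model's output

def classify_on_device(image: IncomingImage) -> float:
    """Hypothetical stand-in for a locally run ML model's nudity score."""
    return image.nudity_score

def handle_incoming(image: IncomingImage) -> str:
    """Decide locally how to display the image; no report leaves the device."""
    if classify_on_device(image) >= NUDITY_THRESHOLD:
        return "blurred-with-warning"  # the user can still choose to reveal it
    return "shown"
```

The key difference from client-side scanning is the final step: here the flagged image is handled entirely between the device and its user, whereas a scanning system would report matches back to the company.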

Meta asserts that it has introduced over 30 tools and resources to enhance child safety, including systems to identify suspicious adult behavior and prevent adults from contacting minors. Additionally, new parental supervision tools will give parents more control over teenagers’ safety settings on Instagram and Facebook Messenger.
