New Instagram Features Aim to Safeguard Young Users from Sextortion and Nudity

Instagram, under Meta's ownership, has announced a set of features designed to shield users, especially young people, from the perils of intimate image abuse and sextortion.

Instagram has announced a set of new features aimed at protecting users, particularly young people, from intimate image abuse and sextortion. These tools come as a response to the growing concern over the platform’s inability to adequately safeguard its users from child sexual abuse material and other forms of online exploitation.

One of the most notable updates is the introduction of nudity protection in private messages. This feature, which Meta first confirmed it was developing in 2022, will be turned on by default for users under the age of 18. Using machine learning that analyzes images on the user's device, the tool will detect images suspected of containing nudity and blur them before they are displayed to the recipient. Because the analysis happens on the device, messages remain end-to-end encrypted and Meta never has access to the content. Recipients will still have the option to view the blurred image, alongside a pop-up message from Meta reminding them not to feel pressured to respond, a safety tips button, and an option to block the sender.
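To make the described flow concrete, here is a minimal sketch of how an on-device detection-and-blur step might be structured. The names (`classify_nudity`, `NUDITY_THRESHOLD`, `IncomingImage`) and the threshold value are hypothetical illustrations, not Meta's actual implementation.

```python
from dataclasses import dataclass

NUDITY_THRESHOLD = 0.8  # hypothetical confidence cutoff


@dataclass
class IncomingImage:
    image_bytes: bytes
    blurred: bool = False
    revealed: bool = False


def classify_nudity(image_bytes: bytes) -> float:
    """Stand-in for the on-device ML model; returns a confidence score.

    Because this runs locally, the plaintext image never leaves the
    device, which is why end-to-end encryption is preserved.
    """
    return 0.0  # placeholder score


def receive_image(image_bytes: bytes, recipient_is_minor: bool) -> IncomingImage:
    msg = IncomingImage(image_bytes)
    protection_on = recipient_is_minor  # enabled by default for under-18s
    if protection_on and classify_nudity(image_bytes) >= NUDITY_THRESHOLD:
        msg.blurred = True  # shown blurred, with safety tips and a block option
    return msg


def reveal(msg: IncomingImage) -> IncomingImage:
    # The recipient can still choose to view the image after the warning.
    msg.revealed = True
    return msg
```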

In addition to blurring incoming images, Meta's new tool will detect when a person is sending a nude image and warn them about the risks of sharing sensitive photos, reminding them that they can delete a message before the recipient sees it. When someone attempts to forward a message containing detected nudity, a final warning will appear urging them to be responsible and respectful, although the image can still be forwarded.
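The sender-side behavior described above is essentially a pair of advisory checkpoints rather than hard blocks. The sketch below illustrates that logic with hypothetical function and type names (`send_image`, `forward_image`, `SendDecision`); it is an assumption about the flow, not Meta's code.

```python
from enum import Enum, auto


class SendDecision(Enum):
    SEND = auto()
    CANCEL = auto()


def send_image(contains_nudity: bool, user_choice: SendDecision) -> bool:
    """Warn the sender about sharing sensitive photos, then respect their choice."""
    if contains_nudity:
        print("Warning: think twice before sharing sensitive photos. "
              "You can unsend a message before it is seen.")
    return user_choice is SendDecision.SEND


def forward_image(contains_nudity: bool, user_choice: SendDecision) -> bool:
    # Forwarding a detected nude triggers one final reminder,
    # but the forward itself is not blocked.
    if contains_nudity:
        print("Reminder: be responsible and respectful when forwarding.")
    return user_choice is SendDecision.SEND
```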

To further protect teens from potential scammers and sextortionists, Meta has added measures that make it harder for these bad actors to reach young users. Message requests from suspected scammers will now be diverted to hidden requests, and teens already in a conversation with such an account will receive a warning with boundary reminders and instructions on how to report the user. Previously, Meta barred accounts from messaging users aged 16 or under unless the two were already connected, even if the other account claimed to be a fellow teen. With the new update, potential scammers won't see the option to message a teen at all, even if the accounts follow each other.
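The routing and gating rules in that paragraph can be summarized as two simple checks, sketched below with hypothetical names (`Account`, `can_message`, `route_request`) and simplified account flags; the real system's signals for flagging suspected scammers are not public.

```python
from dataclasses import dataclass, field


@dataclass
class Account:
    username: str
    is_teen: bool = False
    suspected_scammer: bool = False
    following: set = field(default_factory=set)  # usernames this account follows


def can_message(sender: Account, recipient: Account) -> bool:
    # Suspected scam accounts lose the option to message a teen entirely,
    # even when the two accounts follow each other.
    if recipient.is_teen and sender.suspected_scammer:
        return False
    # Otherwise, teens can only be messaged by accounts they are connected to.
    if recipient.is_teen and sender.username not in recipient.following:
        return False
    return True


def route_request(sender: Account) -> str:
    # New requests from suspected scammers go to hidden requests, not the inbox.
    return "hidden_requests" if sender.suspected_scammer else "inbox"
```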

While these new features demonstrate Instagram’s commitment to improving user safety, particularly for its younger audience, it is important to acknowledge that the platform still has a long way to go in effectively combating child sexual abuse material and other forms of online exploitation. As social media continues to play an increasingly significant role in the lives of young people, it is crucial that platforms like Instagram remain vigilant in their efforts to create a safer online environment and protect their most vulnerable users.
