Meta has announced a new feature, currently in testing, intended to help fight sexual scams and other forms of "image abuse."
Instagram's new feature will automatically blur nudity in direct messages. The social media platform said on Thursday that it was testing the feature as part of its campaign against sexual scams and other forms of "image abuse," and to make it harder for criminals to contact teens.
In addition, the feature will warn teens and prompt them to think twice before sharing intimate images. Meta believes this will strengthen protection against scammers.
Notably, although Meta also owns Facebook and WhatsApp, the feature will not be available on those platforms.
The feature aims to shield Instagram's teen users from cyberflashing by placing nude images behind a safety screen. Users are then prompted to choose whether or not to view such images.
In a further step, Meta announced that it will expand the data it shares with Lantern, the cross-platform online child safety program, to include more "sextortion-specific signals."
The move follows several recent high-profile cases, including two Nigerian brothers who pleaded guilty to sexually extorting teenage boys and young men in Michigan, one of whom took his own life, and a Virginia sheriff's deputy who sexually extorted and kidnapped a 15-year-old girl.
Copyright © 2025 Top Indian News