Instagram May Add a ‘Nudity Protection’ Filter to Safeguard Users


Instagram appears to be testing a new feature that would cover photos in Direct Messages that may contain nudity, safeguarding users from unwanted exposure to explicit content.

The “Nudity Protection” setting was discovered by Alessandro Paluzzi, a developer known for reverse engineering apps and uncovering early versions of upcoming features.

#Instagram is working on nudity protection for chats

Technology on your device covers photos that may contain nudity in chats. Instagram can’t access your photos. pic.twitter.com/iA4wO89DFd

— Alessandro Paluzzi (@alex193a) September 19, 2022

The new nudity protection option would enable Instagram to tap into nudity detection technology in iOS, which scans incoming and outgoing messages on a user’s device to detect potential nudity in attached images.

If the nudity protection feature is enabled, Instagram will automatically blur an image in Direct Messages when the app detects that it may contain nudity. The app will then notify the user that they have been sent an image that may contain nudity, and the user can choose whether or not to view it.

According to Paluzzi’s screenshot, Nudity Protection is an option you can turn on or off in iOS Settings.

In Paluzzi’s screenshot, Instagram reassures users that it “can’t access your photos” and that the feature is simply “technology on your device [that] covers photos that may contain nudity.”

This message indicates that Instagram does not itself download or examine images sent in Direct Messages. Instead, on-device iOS technology scans the messages and filters images based on their content.
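As a rough illustration of what this kind of on-device check can look like for developers, here is a minimal Swift sketch using Apple’s SensitiveContentAnalysis framework, the sensitive-image detection API Apple exposes to apps in more recent iOS releases. The shouldBlur and configureThumbnail helpers are hypothetical, and none of this describes Instagram’s actual implementation; it only shows the general pattern in which analysis runs locally and the app learns nothing more than whether an image was flagged.

```swift
import SensitiveContentAnalysis
import UIKit

// Illustrative sketch only: how an app *could* flag an incoming DM image
// on-device with Apple's SensitiveContentAnalysis framework (iOS 17+).
// This is not Instagram's actual implementation.
func shouldBlur(imageURL: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // Sensitive content analysis must be enabled on the device
    // (by the user or via Screen Time); otherwise the policy is .disabled.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        // Analysis runs entirely on the device; the image is never
        // uploaded to a server for this check.
        let analysis = try await analyzer.analyzeImage(at: imageURL)
        return analysis.isSensitive
    } catch {
        // If analysis fails, this sketch falls back to not blurring;
        // a real app might choose the more cautious default.
        return false
    }
}

// Hypothetical usage in a chat view: overlay a blur when an image is flagged.
func configureThumbnail(_ imageView: UIImageView, imageURL: URL) {
    Task {
        if await shouldBlur(imageURL: imageURL) {
            let blur = UIVisualEffectView(effect: UIBlurEffect(style: .regular))
            blur.frame = imageView.bounds
            blur.autoresizingMask = [.flexibleWidth, .flexibleHeight]
            imageView.addSubview(blur)
        }
    }
}
```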

For its part, Apple has sought to assure users that this kind of filtering is done on-device using artificial intelligence and data matching, and that it does not track or trace the details of any user’s online interactions.

Regardless, the new nudity protection feature is an important step for Meta, Instagram’s parent company, as it works to improve protections for younger users.

Meta has faced serious questions about its efforts to keep younger users safe on its platforms. In June, the company was hit with eight lawsuits alleging that it deliberately altered its algorithms to attract young users.

Earlier this month, Meta was fined a record $402 million for letting teenagers set up accounts on Instagram that publicly displayed their phone numbers and email addresses.


Image credits: Header photo licensed via Depositphotos.
