Online harassment is a problem for many women, who often find their inboxes filled with sexually explicit, unsolicited photos. Instagram is working on a tool to address the problem.
Developer Alessandro Paluzzi spotted an interesting detail in the app's code: a feature that automatically blocks sexually explicit messages. It relies on an AI that analyzes the images.
Instagram wants to fight online harassment
On paper, this is an excellent way to combat online sexual harassment. When a photo is sent in a private message, it is analyzed by an AI that determines whether it is a nude shot. If it is, the photo is still delivered, but blurred. The recipient can then choose whether or not to remove the blur. Important detail: the analysis happens on the device itself, so Instagram never has access to the photos.
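To make the reported flow concrete, here is a minimal Kotlin sketch of the logic described above. It is purely illustrative and not Instagram's actual implementation: `NudityClassifier`, `IncomingPhoto`, and `DmPhotoFilter` are hypothetical names, and the real on-device model and messaging pipeline are not public.

```kotlin
// Hypothetical stand-in for an on-device ML model (not a real Instagram API).
interface NudityClassifier {
    fun isLikelyNude(imageBytes: ByteArray): Boolean
}

data class IncomingPhoto(val bytes: ByteArray, var blurred: Boolean = false)

class DmPhotoFilter(private val classifier: NudityClassifier) {

    // Runs entirely on the device: the photo never leaves it for analysis.
    fun process(photo: IncomingPhoto): IncomingPhoto {
        if (classifier.isLikelyNude(photo.bytes)) {
            photo.blurred = true // delivered, but obscured by default
        }
        return photo
    }

    // The recipient can opt in to view the original.
    fun reveal(photo: IncomingPhoto) {
        photo.blurred = false
    }
}

fun main() {
    // Dummy classifier for demonstration: flags every photo.
    val filter = DmPhotoFilter(object : NudityClassifier {
        override fun isLikelyNude(imageBytes: ByteArray) = true
    })
    val photo = filter.process(IncomingPhoto(ByteArray(0)))
    println("Blurred on arrival: ${photo.blurred}")   // true
    filter.reveal(photo)
    println("After user taps to reveal: ${photo.blurred}") // false
}
```

The key design point reported here is that detection and blurring happen client-side, which is why the server never sees the photo's contents.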
Meta confirmed to The Verge that it is working on a feature like this:
We are working closely with experts to ensure that these new options protect users’ privacy while giving them full control over the messages they receive.
For now, the feature is still under development and may not go live for months. Still, it is encouraging to see Meta starting to take action on these sensitive issues. The American company has long been criticized for its inaction in this area, which has made Instagram's private messages something of a jungle.
The social network took another step toward protecting its users a few weeks ago by automatically hiding sensitive content from anyone under the age of 16. Blocking sexually explicit messages is the logical next step.
#Instagram Working on nudity protection for chats 👀
ℹ️ The technology on your device covers photos that may contain nudity in chats. Instagram cannot access photos. pic.twitter.com/iA4wO89DFd
— Alessandro Paluzzi (@alex193a) 19 September 2022
Source: The Verge