Social media site Instagram is introducing several new safety features in the hope of combating rising rates of sextortion on the platform.
The new safety features are designed to complement Instagram Teen Accounts, which launched in September. Teen accounts are private by default, but malicious accounts can still send follow requests, and teens can accept those requests or follow back accounts that may be harmful.
Sextortion is a rising form of scam that particularly targets teens and young people. A scammer fraudulently gains a user’s trust, coerces them into sending intimate photos, and then extorts them for money by threatening to send the photos to the target’s friends or family.
What are Instagram’s new safety features?
One of the enhanced safety features is an algorithm that detects when accounts are engaging in suspicious or scammy behavior. If an account triggers the detection algorithm, it will be flagged, and depending on how suspicious the algorithm judges the behavior to be, follow requests from the account will either be sent to spam or blocked altogether.
In addition, because data shows that teens are often targeted by scammers from other countries, Instagram is experimenting with safety notices in Instagram DMs and Messenger messages. These alerts will tell users to be careful when interacting with certain accounts.
One of the key safety features being added will block flagged accounts from seeing a user’s follower and following lists. Scammers frequently use these lists to blackmail users, and hiding the lists from them will help prevent this. Flagged accounts will also be prevented from seeing who has liked a user’s posts, photos the user has been tagged in, and other accounts tagged in the user’s photos.
“Hi 🫶 Today we’re introducing Teen Accounts, a new experience for teens with built-in safety features, plus more ways to see content you like. Tap through for more info 👇” — Instagram (@instagram), September 17, 2024
A key part of Instagram’s new safety features is the blocking of screenshots of ‘view once’ or ‘allow replay’ images sent via Instagram DM or Messenger. If a user attempts to screenshot a limited photo or video, they will be prevented from doing so. Accompanying this restriction, these images will no longer open on the Instagram web app, so users can’t circumvent the anti-screenshot protection.
Nudity filtering, which has been in testing since April, will now be turned on by default for all teen accounts, along with added warnings about the risks of sending sensitive images.
In addition to these safety measures, Meta has partnered with several organizations. The National Center for Missing and Exploited Children, Thorn, and Crisis Text Line are all collaborating with the tech giant to provide informative content and support to teen users.
Full details of all the safety features and educational campaigns are in Meta’s press release.
Featured image credit: generated with Ideogram