Instagram adds new protections for accounts that primarily feature children

Meta is introducing additional safeguards for Instagram accounts run by adults that primarily feature children, the company announced on Wednesday. These accounts will automatically be placed into the app’s strictest message settings to prevent unwanted messages, and will have the platform’s “Hidden Words” feature enabled to filter offensive comments. The company is also rolling out new safety features for teen accounts.

Accounts that will be placed into the new, stricter message settings include those run by adults who regularly share photos and videos of their children, along with accounts run by parents or talent managers who represent children.

“While these accounts are overwhelmingly used in benign ways, unfortunately, there are people who may try to abuse them, leaving sexualized comments under their posts or asking for sexual images in DMs, in clear violation of our rules,” the company wrote in a blog post. “Today we’re announcing steps to help prevent this abuse.”

Meta says it will attempt to prevent potentially suspicious adults, such as people who have already been blocked by teens, from finding accounts that primarily feature children. Meta will avoid recommending suspicious adults to these accounts on Instagram, and vice versa, and make it harder for them to find each other in Instagram Search.

Today’s announcement comes as Meta and Instagram have taken steps over the past year to address mental health concerns tied to social media. These concerns have been raised by the U.S. Surgeon General and various states, some of which have even gone so far as to require parental consent for access to social media.

The changes will significantly impact the accounts of family vloggers/creators and parents running accounts for “kidfluencers,” both of which have faced criticism for the risks associated with sharing children’s lives on social media. A New York Times investigation published last year found that parents are often aware of, or even participating in, their child’s exploitation, for example by selling photos or clothing the child wore. In its examination of 5,000 parent-run accounts, the NYT found 32 million connections to male followers.

The company says accounts placed into these stricter settings will see a notice at the top of their Instagram Feed informing them that the social network has updated their safety settings. The notice will also prompt them to review their account privacy settings.

Meta notes it has removed almost 135,000 Instagram accounts that were sexualizing accounts that primarily feature children, along with 500,000 Instagram and Facebook accounts associated with those it had removed.


Alongside today’s announcement, Meta is also bringing new safety features to DMs in Teen Accounts, its app experience with built-in protections for teens that are automatically applied.

Teens will now see new options to view safety tips, reminding them to check profiles carefully and be mindful of what they share. Plus, the month and year that the account joined Instagram will be displayed at the top of new chats. In addition, Instagram has added a new block and report option that lets users do both things at the same time.

The new features are designed to give teens more context about the accounts they’re messaging and help them spot potential scammers, Meta says.

“These new features complement the safety notices we show to remind people to be cautious in private messages and to block and report anything that makes them uncomfortable – and we’re encouraged to see teens responding to them,” Meta wrote in the blog post. “In June alone, they blocked accounts 1 million times and reported another 1 million after seeing a safety notice.”

Meta also provided an update on its nudity protection filter, noting that 99% of people, including teens, have kept it turned on. Last month, over 40% of blurred images received in DMs stayed blurred, the company said.
