In 2021, Meta barred adults on Instagram from messaging under-18 users who don't follow them. Now it's expanding that rule to help protect younger teens from potentially unwanted contact. By default, users under 16 (or under 18, depending on their country) can no longer receive DMs from anyone they don't follow, even messages sent by fellow teens.
The new safety measure applies to both Instagram and Messenger. On Messenger in particular, young users will only be able to receive messages from their Facebook friends or people in their phone contacts. Because the setting is enabled by default, teens with accounts under parental supervision will need a guardian's approval to change it. Of course, the setting relies on a user's declared age and on Meta's age-prediction technology, so it isn't foolproof.
"We want teens to have safe, age-appropriate experiences on our apps," Meta said in its announcement. Earlier this month, the company said it would start hiding content related to self-harm, graphic violence, eating disorders and other harmful topics from teens on Instagram and Facebook. Users under 16 won't see posts on those topics in their Feeds and Stories, even if they're shared by accounts they follow. Meta also recently rolled out a mindfulness feature that sends "nighttime nudges" urging teens under 18 to close the app and go to bed if they've been scrolling for more than 10 minutes.
Meta made these changes after a wave of lawsuits and complaints over how it protects its younger userbase. An unsealed lawsuit filed by 33 states accuses the company of actively targeting children under 13 to use its apps and websites, and of continuing to harvest their data even after becoming aware of their ages. A Wall Street Journal report also accused Instagram of serving "risqué footage of children as well as overtly sexual adult videos" to accounts that follow teenage influencers. In December 2023, the state of New Mexico sued Meta, claiming that Facebook and Instagram algorithms recommended sexual content to minors. And just this month, The Wall Street Journal reported on unredacted internal Meta presentations related to that case: by employees' own estimates, 100,000 child users were harassed daily on Facebook and Instagram, underscoring the need for stricter protections on its platforms.