Proactive Investors - Meta Platforms Inc (NASDAQ:META) will hide certain “age-inappropriate” content from the social media feeds of teenagers on its platforms, the company announced Tuesday.
Content related to self-harm, suicide and eating disorders will not appear in minors’ feeds or Stories on Facebook or Instagram, even if that content was posted by an account they follow.
The company already restricts those topics from being recommended to teen users via Instagram’s Reels and Explore features.
Meta also said it will hide more search results related to those topics and provide people with resources for help if they do search for related material.
"We want teens to have safe, age-appropriate experiences on our apps," the company wrote in a blog post.
Additionally, teens will be subject to the most restrictive Facebook and Instagram content control settings by default, the company said.
Meta also said it will send notifications to young users prompting them to make their accounts more private, adding that it has “developed more than 30 tools and resources to support teens and their parents.”
The new policy follows years of backlash against Meta, including a 2022 lawsuit from a family alleging that Instagram recommended their teenage daughter content casting self-harm and anorexia in a positive light.
In October, a bipartisan cohort of 33 state attorneys general sued the company over what they alleged were addictive features designed for young people.