Teen Instagram and Facebook Accounts Face Content Restrictions

In response to increased scrutiny over the impact of its platforms on young users, particularly around self-harm and eating disorders, Meta has announced new restrictions on the content that teen accounts can see. The changes affect both Instagram and Facebook, making it more difficult for teens to access content related to self-harm, eating disorders, and suicide. Key changes include:

  • Hiding search results: When users search for terms related to self-harm, eating disorders, or suicide, the results will be hidden and users will instead be directed to expert resources for help.
  • Restricting content in Feed and Stories: Content on these topics will not appear in a teen’s Feed or Stories, even when it is shared by someone they follow.
  • Automatic content control settings: All teen accounts on Instagram and Facebook will be automatically placed in the most restrictive content control setting.

Starting immediately, Instagram and Facebook will limit teen users’ exposure to content related to self-harm and eating disorders. The initiative aims to mitigate the potential negative impact of such content on vulnerable individuals, acknowledging the influence social media can have on mental health.

These changes come after years of criticism and lawsuits accusing Meta of contributing to the youth mental health crisis. While the company already avoids recommending this type of content to teens in areas such as Reels and Explore, the new measures aim to further shield young users from potentially harmful material.

The restrictions are part of Meta’s ongoing effort to create safe, age-appropriate experiences for young users. By limiting access to content related to self-harm and eating disorders, the company aims to reduce the risk that vulnerable teens encounter potentially triggering material.