Instagram, under increasing pressure to protect young users, is taking a significant step towards making its platform safer for teenagers. Starting this week in the US, UK, Canada, and Australia, all new Instagram accounts for users under 18 will automatically be set up as private, restricted teen accounts. Existing teen accounts will be transitioned to this new format over the next two months. The European Union will follow suit later this year.
Meta, Instagram’s parent company, acknowledges the possibility of teenagers misrepresenting their age and has stated that it will implement stricter age verification procedures, especially for new account creation. The company is also developing technology to proactively identify accounts that falsely claim to belong to adults and automatically switch them to restricted teen profiles.
These teen accounts will operate with a set of built-in safeguards. Privacy is prioritized, with accounts set to private by default, allowing only approved followers to see posts and content. Direct messages are restricted to ensure that teens only receive messages from those they follow or have existing connections with. Meta is also limiting access to ‘sensitive content,’ including videos of violence or content promoting cosmetic procedures.
To combat excessive screen time, teens will receive a notification once they have spent more than 60 minutes on Instagram in a day. Additionally, a ‘sleep mode’ will be activated between 10 p.m. and 7 a.m., automatically muting notifications and sending auto-replies to direct messages. While these features will be enabled for all teenagers, 16- and 17-year-olds will have the option to turn them off. Users under 16 will need parental permission to adjust these settings.
Meta acknowledges the concerns voiced by parents regarding inappropriate content, unsolicited contact, and excessive screen time. The company emphasizes that these changes directly address these issues, empowering parents and teens to create a safer and more controlled online experience.
This move comes as Meta faces numerous lawsuits from US states accusing the company of intentionally designing its platforms, Instagram and Facebook, to foster addictive habits among children, contributing to a mental health crisis among young people. Critics have previously argued that Meta’s past efforts to address teen safety and mental health fell short. And while the new features nudge teens about screen time, they can still dismiss or bypass those limits. Parents, however, can opt into ‘parental supervision’ mode, which lets them set firm daily time limits, such as 15 minutes, on their children’s Instagram use.
With these new features, Meta aims to give parents greater control over their children’s accounts. Teens under 16 will need parental permission to modify settings to less restrictive options. This process involves setting up ‘parental supervision’ on their accounts and linking them to a parent or guardian.
Meta’s President of Global Affairs, Nick Clegg, recently highlighted how few parents engage with the company’s existing parental controls. Naomi Gleit, Head of Product at Meta, believes the new teen accounts will motivate more parents to use the supervision features, which give them insight into who their children are interacting with, including messages, followers, and the accounts their children follow. In her view, this increased visibility will prompt parents to have conversations with their children about online safety and help them navigate challenging situations they may encounter online.
US Surgeon General Vivek Murthy has previously emphasized the burden placed on parents when it comes to safeguarding children on social media. He argues that parents are expected to manage a rapidly evolving technology that significantly influences their children’s self-perception, friendships, and worldviews, a responsibility that previous generations never faced.
Meta’s latest initiative represents a significant shift in its approach to teen safety. Making teen accounts private by default, coupled with expanded parental controls, signals a renewed commitment to creating a safer online environment for young users. Ultimately, though, the effectiveness of these measures will depend on how accurately teen users are identified and on how actively parents engage in overseeing their children’s digital experiences.