Instagram is revamping its platform for teenagers, introducing new features aimed at enhancing “built-in protections” for young users and providing additional oversight and reassurance for parents.
The updated “teen accounts” for users aged 13 to 15 will come with many privacy settings activated by default, so teens will not need to opt in themselves. These accounts will be private by default, meaning their posts can only be seen by approved followers, and the teen must approve any new follower requests. These settings can only be altered with parental or guardian oversight, or once the user turns 16.
This change comes amid global pressure on social media companies to enhance safety measures and protect young users from harmful content. The NSPCC, a UK children’s charity, welcomed the update as a “step in the right direction” but criticized Instagram’s owner, Meta, for placing the onus on children and parents to ensure safety. Rani Govender from NSPCC urged for more proactive measures to prevent harmful content and abuse.
The new features will roll out starting Tuesday in the UK, US, Canada, and Australia, with plans to introduce them to the EU later this year. Meta describes the update as a “new experience for teens, guided by parents,” aimed at supporting parental oversight and ensuring teen safety with enhanced protections.
However, media regulator Ofcom has expressed concerns about whether parents will effectively use these new controls. Senior Meta executive Sir Nick Clegg noted that even when such controls are provided, they are often not utilized by parents.
Ian Russell, whose daughter Molly encountered harmful content on Instagram before her tragic death, emphasized the need to evaluate the effectiveness of these measures once implemented. He stressed the importance of transparency and accountability from Meta.
The new teen accounts will include several default settings designed to enhance safety, such as stricter controls on sensitive content and muted notifications overnight. Parents who choose to supervise an account will have access to information about whom their child messages and which interests they have expressed, but not the content of the messages themselves.
Instagram plans to transition millions of existing teen accounts to this new setup within 60 days of announcing the changes. The system will largely rely on users being truthful about their ages, though Instagram will employ AI tools starting January in the US to detect and correct any discrepancies in age reporting.
The UK’s Online Safety Act, which mandates online platforms to ensure child safety or face significant fines, is prompting increased scrutiny of social media practices. Ofcom has warned that non-compliance could lead to public shaming and restrictions for under-18s.
Industry analyst Matt Navarra described these changes as significant but emphasized the importance of effective enforcement to prevent tech-savvy teens from bypassing safeguards.
Instagram is not the first platform to introduce such features; it has previously launched tools for parental oversight and content management. Similarly, Snapchat and YouTube have implemented measures to enhance safety for younger users.
Despite these efforts, there remains a critical need to address the persistent exposure of young people to harmful content. Ofcom’s recent study revealed that all children interviewed had encountered violent material online, with Instagram, WhatsApp, and Snapchat frequently cited.
While the Online Safety Act will require platforms to take action against illegal content, platforms will not be required to comply fully until 2025. In Australia, there are discussions about imposing a new age limit for social media use.
Instagram’s updated controls shift more responsibility to parents, who will need to manage their child’s Instagram experience more directly. Nevertheless, parents do not have control over Instagram’s algorithms or the vast content shared by its global user base.
Social media expert Paolo Pescatore called this an “important step” toward safeguarding children online and highlighted the need for continued improvements in digital wellbeing and parental control.