News
Meta expands restrictions for teen users to Facebook and Messenger

According to online media reports of Tuesday, April 8, 2025, Meta is expanding Teen Accounts, which it describes as its age-appropriate experience for under-18s, to Facebook and Messenger.
The system reportedly places younger teens on the platforms into more restrictive settings by default, with parental permission required to live stream or to turn off image protections for messages.
The system was first introduced on Instagram last September, a move Meta says “fundamentally changed the experience for teens” on the platform.
But campaigners say it’s unclear what difference Teen Accounts has actually made.
Andy Burrows, chief executive of the Molly Rose Foundation, said:
“Eight months after Meta rolled out Teen Accounts on Instagram, we’ve had silence from Mark Zuckerberg about whether this has actually been effective and even what sensitive content it actually tackles.”
He added that it was “appalling” that parents still did not know whether the settings prevented their children being “algorithmically recommended” inappropriate or harmful content.
But Drew Benvie, chief executive of social media consultancy Battenhall, said it was a step in the right direction.
“For once, big social are fighting for the leadership position not for the most highly engaged teen user base, but for the safest,” he said.
However, he also pointed out that there was a risk, as with all platforms, that teens could “find a way around safety settings.”
The expanded roll-out of Teen Accounts is beginning in the UK, US, Australia and Canada from Tuesday.
Companies providing services popular with children have faced pressure to introduce parental controls or safety mechanisms to safeguard young users’ experiences.
In the UK, they also face legal requirements to prevent children from encountering harmful and illegal content on their platforms, under the Online Safety Act.
Meta’s expansion of safety features for teens comes as some lawmakers say they plan to press ahead with proposed legislation, such as the Kids Online Safety Act (KOSA), which seeks to protect children from social media harms.
Meta, ByteDance’s TikTok and Google’s (GOOGL.O) YouTube already face hundreds of lawsuits filed on behalf of children and school districts over the addictive nature of social media.
In 2023, 33 U.S. states, including California and New York, sued Meta for misleading the public about the dangers of its platforms.
“We will start including these updates in the next couple of months,” the company said.
In July 2024, the U.S. Senate advanced two online safety bills – KOSA and the Children and Teens’ Online Privacy Protection Act.
Analysts say these bills would force social media companies to take responsibility for how their platforms affect children and teens.
The Republican-led House declined to bring KOSA up for a vote last year.
However, lawmakers suggested at a committee hearing late last month that they still plan to press ahead with new laws to protect children online.
Top platforms, including Facebook, Instagram and TikTok, allow users aged 13 and above to sign up.