GUWAHATI: Meta Platforms announced new safety measures on Tuesday, April 8 aimed at enhancing protection for teens on its platforms. The updates include stricter parental controls for Instagram users under 16 and an expansion of existing safeguards to Facebook and Messenger.
Under the new rules, teens under 16 will be barred from going live on Instagram or from disabling the blur applied to images suspected of containing nudity in direct messages, unless they have parental consent. These measures build on Meta's broader effort to address growing concerns over how social media affects the mental health and well-being of young users.
Meta introduced its teen supervision tools for Instagram last September, offering parents more control over their children’s online activity. The company now plans to roll out the latest updates first in the U.S., U.K., Canada, and Australia, with a global release expected in the coming months.
In addition to the new Instagram policies, Meta is extending similar protections to Facebook and Messenger for users under 18. These include automatically setting teen accounts to private, restricting messages from unknown contacts, limiting exposure to sensitive content such as violent videos, issuing reminders to log off after an hour of use, and muting notifications during bedtime hours.