Meta blocked more than 544,000 accounts in the first days of Australia's new law prohibiting social media access for people under 16. The law, which came into effect in December, requires platforms such as Instagram and Facebook to ensure that users under 16 cannot hold accounts.
Supporters of the ban, including government officials and child protection advocates, have framed it as a necessary step to shield minors from harmful content and the negative effects of social media algorithms. Meta, which owns the affected platforms, has meanwhile called for constructive engagement with the Australian government on age verification methods.
In its compliance efforts, Meta reported that 330,639 accounts were blocked on Instagram, 173,497 on Facebook, and 39,916 on Threads during the first week of the law's enforcement. The company argues that age verification should happen at the app store level, which it believes would simplify compliance and strengthen protections for young users.
While the legislation has garnered significant support from parents and is being studied by other countries, experts have raised concerns about its efficacy. Critics argue that children may easily bypass age verification filters, potentially pushing them toward less supervised online environments. Many young people, particularly those in marginalized communities, fear losing essential social connections because of the ban.
The landmark law makes Australia the first country to impose such stringent age restrictions on social media without allowing parental exemptions, sparking debate around the world about how to balance online safety with young people's need for connection.