Australia is introducing significant reforms to regulate social media access for children through an amendment to the Online Safety Act 2021 (Cth). The Social Media Minimum Age (SMMA) regime[1] will take effect from 10 December 2025 (the compliance date). From the compliance date, platforms classified as age-restricted social media platforms (ARSMPs) will be required to prevent users under the age of 16 years from creating or maintaining accounts.
Key takeaways
- Significant civil penalties for breaches – up to $9.9 million for individuals and $49.5 million for companies – underscore the seriousness of the harms the SMMA regime seeks to prevent. In light of this, ARSMPs should ensure they are prepared to comply from the compliance date.
- ARSMPs – platforms whose primary function is to facilitate social interaction and the sharing of content – will be required to take “reasonable steps” to prevent individuals under 16 years of age from accessing their platforms.
- “Reasonable steps” may include, but are not limited to, measures such as verifying the age of new users through trusted third-party systems, reviewing existing accounts to identify potential underage users and implementing technological solutions or processes designed to assess or confirm user age.
- It is imperative that the reasonable steps taken by ARSMPs also comply with existing privacy law obligations. The eSafety Commissioner has provided guidance on how this can be done.
What is the SMMA regime?
The incoming SMMA regime is one of the most significant regulatory developments in recent years to safeguard children online – globally. The law applies to ARSMPs, being platforms whose primary function is to facilitate social interaction, allowing users to share content, connect with others, or communicate publicly or semi-publicly. Not all online services fall under the SMMA regime, with messaging-only applications, educational platforms and gaming networks without social features generally sitting outside the scope of the regime at this stage. However, platforms with hybrid or social features may still be captured depending on their functionality.
From the compliance date, all ARSMPs must demonstrate that they have implemented effective measures to prevent under-16s from accessing their services. This obligation applies both to new accounts and to existing accounts created prior to the commencement of the SMMA regime. Failure to comply with the requirements of the SMMA regime will contravene a civil penalty provision and may result in significant monetary penalties. This underscores the need for thorough preparation and a structured, proactive and fit-for-purpose compliance strategy. A key consideration for ARSMPs will be what constitutes “reasonable steps” in practice to meet this obligation.
Breaches of the SMMA regime carry substantial financial consequences, reflecting the gravity of the harms these obligations aim to prevent. Penalties can reach up to $9.9 million for individuals and $49.5 million for companies. These figures underscore the legislative intent to ensure robust protections for children in the digital environment.
Given the scale of potential liability, ARSMPs should act now to achieve full compliance by the designated date. This includes reviewing internal processes, implementing safeguards, and aligning governance frameworks with the regime’s requirements. Non-compliance risks not only severe penalties but also reputational damage and loss of stakeholder trust.
This reform is high on the political agenda and has attracted global attention, including discussion at the UN General Assembly in September 2025. With its strong focus on child safety, regulatory enforcement is expected to commence immediately once the SMMA regime takes effect.
Applying the “reasonable steps” test under the eSafety Guidance
Under Part 4A of the Online Safety Act, ARSMPs are required to implement practical, effective and proportionate controls to prevent users under the age of 16 from accessing their services. While the law does not prescribe specific measures, the “reasonable steps” test is explained in detail through the SMMA Regulatory Guidance (eSafety Guidance) issued by the eSafety Commissioner, who is responsible for administering and enforcing the Online Safety Act. The eSafety Guidance, published on 16 September 2025, is intended to steer ARSMPs toward compliance, providing a framework for what effective, proportionate and risk-based controls look like in practice.
Reasonable steps may include:
- verifying the age of new users through trusted third-party systems, including age-verification and geolocation technologies
- reviewing existing accounts to identify underage users
- deploying technological solutions and processes to assess or confirm user age.
Any personal information collected for the purposes of compliance under the SMMA regime must be handled in accordance with privacy obligations, used solely for age verification and disposed of appropriately once no longer required.
ARSMPs’ compliance will be assessed on outcomes rather than formalities, and platforms must be able to demonstrate that their measures are effective in practice, including by:
- monitoring accounts for indications of underage use
- preventing underage users from creating new accounts
- providing mechanisms for reporting suspected underage activity
- maintaining documentation and records of procedures and safeguards in place.
Regulators are likely to consider whether the steps are sufficient to prevent under-16 access, proportionate to each ARSMP’s scale and risk profile, and aligned with privacy and data protection requirements.
The Office of the Australian Information Commissioner (OAIC) has recommended that ARSMPs draw on information already available to them – including phone numbers, device identifiers and IP addresses – to infer a user’s age. Other relevant signals may include how long the account has existed on the platform, the type of content the user engages with, and analysis of linguistic conventions.
The road ahead: safety and privacy in balance
Australia’s approach is drawing global attention as the first law of its kind to impose a minimum age requirement for social media platforms. Governments, regulators and industry stakeholders will be observing closely to see how the framework is applied in practice and whether it sets a precedent for managing age-related risks while balancing privacy and operational demands. ARSMPs will need to ensure that any age verification measures are consistent with privacy laws and recent OAIC guidance, particularly around data minimisation, purpose limitation and secure disposal of personal information.
ARSMPs are already preparing for the reforms, with Meta announcing in early December that it has begun locking children out of their accounts, while other platforms are issuing email prompts and notifications to users about future access. While some parties have raised legal or human rights concerns about the validity of the SMMA regime, the government has confirmed that implementation will proceed as scheduled and that enforcement will begin as planned once the requirements take effect.
The introduction of the SMMA regime represents a significant regulatory shift. By centring the law on the “reasonable steps” requirement, it establishes a clear expectation that ARSMPs must take deliberate, evidence-based measures to protect children. Compliance will demand not only technical solutions, but also rigorous governance, accountability and an ongoing commitment to oversight and improvement. Critically, these measures must be designed to balance online safety objectives with privacy obligations, ensuring that age verification processes do not create unnecessary risks or lead to privacy breaches. In doing so, the SMMA regime sets a new benchmark for how digital platforms must manage age-related risks, signalling a significant evolution in the regulation of online safety. Given the scale and resources of ARSMPs, regulators will expect full compliance – and the eSafety Commissioner has indicated that it will enforce accordingly.
[1] The SMMA regime is introduced through the Online Safety Amendment (Social Media Minimum Age) Act 2024 (Cth).