Social media ban for children under 16: What are the privacy implications and impact on the future?
The government has now passed legislation to ban any individual under the age of 16 from social media. The impact of the legislation will be felt by users, parents, and the social media platforms themselves.
The government has stated that the legislation will not seek to impose penalties on users, but will instead place an onus on social media operators to show they are taking reasonable steps to prevent access by individuals under the age of 16. Australia’s online regulator, the eSafety Commissioner, will be the enforcement body for this legislation, which will apply to social media platforms from the end of next year.
Understanding the flow-on effects of the legislation, particularly for privacy and compliance, is crucial to understanding the legislation itself. As Privacy law experts, parents and social media users themselves, Kelly Dickson and Eliza Sinclair discuss the legislation’s intent and how they see the Bill playing out for all parties involved.
Why is the social media ban being introduced?
The government argues that unregulated access to social media is having a negative impact on the mental health of children and young teenagers. It has expressed concern that young teenagers and children are being exposed to online content that would often not be considered appropriate or healthy even for adults. The ban is intended to benefit children in the vulnerable age bracket of 13 to 15 years old.
This vulnerable age bracket is where a child experiences major life and maturity changes. The thinking behind the ban originates from the idea that exposure to offensive, explicit, or degrading content online can be even more harmful at this vulnerable point in life.
The government’s stance is that social media operators have failed to implement effective changes to protect young people from exposure to harmful content, incentivising the government to take steps to protect the development of teenagers in these vulnerable years. Parents have also been calling for change, putting pressure on the government to act. A parent advocacy group gathered more than 125,000 signatures calling for the ban.
Who does the social media ban cover?
The legislation will ban all children under the age of 16, even those who already have a social media account; existing accounts will be removed. The ban also covers accounts held with parental consent, with the onus on social media platforms to show they are taking reasonable steps to exclude children under the age of 16. Significantly for social media platforms, penalties for non-compliance of at least $50 million can apply, in line with Australian Competition and Consumer law penalties.
What platforms will it apply to?
The definition of social media will be adopted from the Online Safety Act, which currently defines social media as:
An electronic service that satisfies the following conditions:
- The sole or primary purpose of the service is to enable online social interaction between two or more end users.
- The service allows end users to link to, or interact with, some or all of the other end users.
- The service allows end users to post material on the service.
The definition of social media adopted is very broad and may capture a wide range of platforms. The platforms to be covered are yet to be specifically listed; however, large platforms such as Instagram, TikTok, Snapchat and X (formerly Twitter) will be captured, while gaming and messaging platforms, and sites that can be accessed without an account (e.g. YouTube), will be exempt.
What changes can be expected?
The main changes expected under the legislation include:
- Responsibility on social media companies to take reasonable steps to block individuals under the age of 16, with what constitutes “reasonable steps” likely to be set out in regulatory guidance;
- Enforcement and regulatory responsibilities conferred on the eSafety Commissioner; and
- Penalties for platforms that fail to comply with the legislation.
Issues for children and parents
Many issues are apparent from the information currently provided. For example, there are legitimate reasons why a young teenager under the age of 16 may need independent access to social media.
Many teenagers hold part-time jobs and have started thinking about their future careers. Social media enables them to engage with education institutions, potential employers, and networks of people independently.
Many teenagers are also discovering their identity and their place in wider society. Online communities can often provide support to those facing these questions when they may not have the appropriate support available at home.
Some parents have raised concerns that this law will interfere with the way they choose to raise their children, and that a blanket ban strips them of the right to consent to their child’s use of social media.
Issues for platform operators
Banning existing users under the age of 16 presents a technological nightmare for social media operators. To exclude everyone under 16, they will have to review all social media accounts and require users to verify their age. The verification process itself raises significant privacy and data breach concerns.
As lawyers practising Privacy law, we know that it is not a matter of whether privacy issues will arise, but when, as methods for verifying an individual’s age commonly include ID verification or biometric data.
Social media platforms have argued that the verification obligation should sit with app stores rather than with individual platforms, so that an individual would only be required to confirm their age once. It is also possible that third-party providers may be involved for verification purposes.
The government has confirmed that, whatever verification processes are adopted, privacy protections will also be introduced to cover any data individuals end up providing. These platforms already collect personal information about children without meaningful oversight, and regulation could introduce controls and require these platforms to be transparent about how they handle the data of young users.
Social media platforms have also argued that a preferred approach for parents may be to implement social media controls that create age-appropriate spaces, rather than a blanket ban. The government has commented that if it does not get involved, it is effectively leaving it to tech companies to determine what is acceptable.
Where to next?
The legislation will come into effect at the end of 2025, giving social media platforms a 12-month grace period to prepare for the change.
With the recent passing of the Privacy and Other Legislation Amendment Bill 2024 (Cth), the Privacy Commissioner is mandated to create a Code relating to the use of children’s data, which may bear on the collection of data for social media accounts.
As so much continues to unfold in the Privacy law space, we recommend that social media platforms, including lesser-known platforms that meet the definition of social media, continue to stay informed of further updates and communications around children’s privacy rights as information becomes available.
Macpherson Kelley’s Privacy law team will continue to provide a legal lens on the upcoming changes, so please reach out if any of the above raises questions for you.
The information contained in this article is general in nature and cannot be relied on as legal advice nor does it create an engagement. Please contact one of our lawyers listed above for advice about your specific situation.