Automated Decision-Making: Current privacy obligations and what’s in the pipeline for 2026

22 January 2026
Mark Metzeling
Read time: 5 minutes

The accelerating integration of autonomous systems into corporate value chains is transforming how business decisions are made, how data moves, and how value is created. On 10 December 2026, the Privacy and Other Legislation Amendment Act 2024 (Act) will introduce mandatory transparency duties for Australian Privacy Principle (APP) entities that rely on computer programs to make, or substantially assist in making, decisions affecting individuals.

This is set to recalibrate board-level accountability and reshape the compliance landscape for every enterprise deploying robotics, machine learning, or algorithmic control.

The rise of autonomous enterprise operations

Robotics, control systems and artificial intelligence (AI) applications have moved from peripheral pilots to the operational core of sectors as diverse as mining, logistics, healthcare and financial services. Mining majors in Western Australia now run fleets of driverless haul trucks, while hospitals deploy collaborative robots for surgery and patient handling. These systems generate and exchange torrents of sensor data, operational metrics and personal information that traverse on-premises servers, edge devices and hyperscale clouds. For business, the attraction is twofold: automation promises higher productivity and richer data for strategic decision-making. Yet the very attributes that promise a competitive advantage (scale, speed and adaptive autonomy) also amplify legal exposure.

The current Australian legal framework

The main federal legislation relating to automated decision-making comprises the Privacy Act 1988 (Cth) (Privacy Act) and the Australian Consumer Law (ACL), which respectively protect privacy and personal data, and prohibit misleading or deceptive conduct.

Privacy and data protection obligations

Australia’s Privacy Act, as supplemented by the Australian Privacy Principles (APPs), remains the primary source of data protection duties. APP 6 confines the use and disclosure of personal information to purposes reasonably contemplated at collection, imposing a “use compatibility” test that robotic deployments must respect when they repurpose operational data. APP 11 requires organisations to implement security measures that are reasonable in the circumstances, which in practice means technical measures (e.g. encryption at rest and in transit, role-based access controls, penetration testing and supplier assurance where robotic data is relayed to cloud services, and regular security testing) and organisational measures (e.g. staff training, implementation of privacy policies, and incident response plans).

This is not a one-off exercise: entities are required under the Privacy Act to regularly review and update their security measures so that they remain reasonable in light of evolving risks and technology.
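
As a concrete illustration of one such technical measure, the sketch below shows encryption of operational data at rest using the Fernet API from the Python cryptography package. It is a minimal sketch only; the file name and telemetry payload are hypothetical, and in practice key storage and access control would be handled by a dedicated secrets manager.

    # Minimal sketch: encrypting sensor data at rest with Fernet
    # (symmetric, authenticated encryption from the "cryptography" package).
    # File name and payload below are hypothetical examples.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()   # in production, store in a secrets manager
    cipher = Fernet(key)

    telemetry = b'{"truck_id": 42, "location": "-22.59,117.18"}'
    encrypted = cipher.encrypt(telemetry)

    # Persist only the ciphertext; access to the key can then be limited
    # via role-based access controls.
    with open("telemetry.enc", "wb") as f:
        f.write(encrypted)

    # Later, an authorised service decrypts and verifies integrity:
    assert cipher.decrypt(encrypted) == telemetry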

Consumer protection and algorithmic transparency

Section 18 of the ACL prohibits misleading or deceptive conduct in trade or commerce. The Australian Competition and Consumer Commission (ACCC) has already applied this provision to algorithmic decision-making, as demonstrated in 2022 when Trivago, an online travel platform, was fined for ranking results in a manner that misled consumers about price. Any robotic or AI system that personalises offers, credit limits or insurance premiums must therefore be explainable and empirically justified.

The ACCC has since issued further penalty notices and pursued court proceedings in relation to non-transparent automated decision-making, on the grounds that it amounts to misleading or deceptive conduct. This enforcement activity can be seen as a precursor to the more onerous automated decision-making obligations commencing on 10 December 2026, which will be enforceable by both the ACCC and the Office of the Australian Information Commissioner (OAIC).

Defining an “automated decision”

Under the Privacy Act, an automated decision is any outcome where a computer program either:

  • makes a decision itself; or
  • performs a task “substantially and directly related” to making the decision,

and where, in either case, the decision “could reasonably be expected to significantly affect the rights or interests of an individual”.

The legislation makes clear that a “decision” includes failure to decide, and that an effect may be adverse or beneficial. Illustrative examples in the Privacy Act include granting or refusing statutory benefits, determining contractual rights, or controlling access to significant services.
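
To make the two-limb test concrete, here is a minimal, hypothetical triage helper that an organisation might use when cataloguing its systems. The class, field names and examples are our own illustrative assumptions, not terms prescribed by the Act.

    # Hypothetical triage sketch for the Privacy Act's two-limb test.
    # All names and fields are illustrative, not statutory terms of art.
    from dataclasses import dataclass

    @dataclass
    class SystemDecision:
        makes_decision: bool           # the program decides outright
        substantially_assists: bool    # task substantially and directly related
        significant_effect: bool       # could significantly affect rights/interests

    def is_in_scope(d: SystemDecision) -> bool:
        # Limb 1: the program decides, or substantially and directly assists.
        # Limb 2: the decision could significantly affect an individual.
        return (d.makes_decision or d.substantially_assists) and d.significant_effect

    # A credit-limit engine that decides outright and affects contractual rights:
    print(is_in_scope(SystemDecision(True, False, True)))    # True
    # A spell-checker in a loan officer's email client:
    print(is_in_scope(SystemDecision(False, False, False)))  # False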

The 2026 Australian reforms on automated decision-making

From 10 December 2026, any business that is an APP entity and uses personal information in automated or semi-automated decision-making that could reasonably be expected to have a significant effect on an individual’s rights or interests must expand its APP 1 privacy policy to describe the kinds of personal information it uses and the types of decisions it makes.
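
One practical way to prepare that disclosure is to maintain a structured register of automated decisions that can be summarised into the privacy policy. The sketch below is a hypothetical format of our own devising; the Act does not prescribe any particular structure, and the decision types and data categories shown are examples only.

    # Hypothetical register of automated decisions feeding an APP 1
    # privacy policy disclosure. The schema and entries are illustrative.
    decision_register = [
        {
            "decision_type": "credit limit setting",
            "personal_information_used": ["credit history", "income data"],
            "automation_level": "fully automated",
        },
        {
            "decision_type": "insurance premium pricing",
            "personal_information_used": ["claims history", "postcode"],
            "automation_level": "substantially assisted",
        },
    ]

    # Summarise for the privacy policy:
    for entry in decision_register:
        print(f"We use {', '.join(entry['personal_information_used'])} "
              f"in {entry['automation_level']} {entry['decision_type']} decisions.")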

Failure to comply will expose organisations to the Privacy Act’s civil penalty regime, reputational damage and heightened regulatory scrutiny. Non-compliance could result in fines of $62,600 per offence, and significantly more for serious interferences with privacy: up to the greater of $50 million, three times the benefit obtained, or 30% of turnover. Accordingly, preparing early by mapping algorithms, evaluating impacts and embedding governance will be essential for boards and executives who wish to mitigate risk, preserve stakeholder trust and avoid these significant fines.
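
Because the ceiling for serious interferences is the greater of three figures, it scales with the size of the contravention. A back-of-envelope sketch (with entirely hypothetical benefit and turnover figures) shows how:

    # Back-of-envelope calculation of the maximum penalty ceiling for a
    # serious interference with privacy, per the three-limb formulation.
    # Benefit and turnover figures below are hypothetical.
    def max_penalty(benefit_obtained: float, annual_turnover: float) -> float:
        return max(50_000_000, 3 * benefit_obtained, 0.30 * annual_turnover)

    # An entity that gained $30m from the conduct on $400m turnover:
    print(f"${max_penalty(30_000_000, 400_000_000):,.0f}")  # $120,000,000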

Legislative context and scope of the 2026 Privacy reforms

The amendments apply prospectively to any decision made on or after 10 December 2026, irrespective of whether the underlying algorithm, data collection or deployment arrangements were in place beforehand. Accordingly, it is advisable to ensure your existing systems are compliant well before this date.

Mandatory transparency duties: Who is covered?

As noted above, the duties bind all “APP entities” under the Privacy Act, being:

  • Australian government agencies;
  • Foreign corporations carrying on business in Australia and collecting or holding Australians’ personal information;
  • Private-sector organisations with annual turnover exceeding A$3 million; and
  • Businesses with annual turnover below A$3 million that:
    • trade in personal information; or
    • fall within existing Privacy Act designations (health-service providers, credit reporting bodies, Consumer Data Right participants and others).

The following businesses are most likely to be affected by the new disclosure requirement:

  • Financial services firms, which rely heavily on credit-scoring, fraud detection and algorithmic trading.
  • E-commerce and digital-platform operators that deploy recommender systems or dynamic pricing that influence consumer choice and access.
  • Telecommunications providers that automate customer identity checks and service provisioning.
  • Insurers, health-tech and medtech companies that use predictive analytics to determine eligibility and pricing.
  • Recruitment, gig-economy and HR-tech platforms that apply algorithmic screening that directly shapes employment opportunities.
  • Energy and utilities providers that automate billing, hardship assessments and disconnections.
  • Any other enterprise with large-scale data analytics and AI-driven decisioning that affects customer entitlements, pricing or access.

Engage with a Privacy lawyer early

Noting the incoming transparency duties on your calendar is just the first step. As our Privacy lawyers will discuss in a later article, the incoming obligations carry significant operational, legal and financial implications that will take time to address before December 2026.

Engaging a Privacy lawyer who is across all the incoming changes from the outset can save your organisation the hefty costs and reputational damage that come with litigation or an OAIC fine. Macpherson Kelley’s Privacy team can provide risk analysis and strategic recommendations for compliance readiness ahead of the December 2026 deadline. Get in touch with our talented team today.

The information contained in this article is general in nature and cannot be relied on as legal advice, nor does it create an engagement. Please contact one of our lawyers listed above for advice about your specific situation.
