Automated Decision Making: Business obligations, risks and strategic responses to new reforms

12 February 2026
Mark Metzeling
7 mins reading time

New privacy reforms are on the horizon in Australia, introducing additional obligations around Automated Decision-Making.

As discussed in an earlier insight, from 10 December 2026 any business that is an Australian Privacy Principle (APP) entity and uses personal information in automated or semi-automated decision-making that could reasonably be expected to have a significant effect on an individual’s rights or interests must expand its APP 1 privacy policy to describe the data it uses and the types of decisions it makes.

Failure to comply will expose organisations to the Privacy Act’s civil penalty regime, reputational damage and heightened regulatory scrutiny.

While we’ve already covered the current requirements and who will be covered by the incoming regime, our Privacy lawyers will now discuss in greater detail the likely operational and financial implications, key compliance challenges, and practical steps for achieving readiness by the 10 December 2026 commencement date.

Mandatory privacy-policy disclosure

As stipulated on the Office of the Australian Information Commissioner (OAIC) website, an APP entity is required to have a clearly expressed and up-to-date APP Privacy Policy.

The policy should, at a minimum, be:

  • easy to understand,
  • easy to navigate, and
  • limited to information that is relevant to how the entity manages personal information.

From 10 December 2026, additional information must be provided. Every covered organisation must update its APP 1 privacy policy to set out, in plain language:

  • The kinds of personal information used in the operation of such automated decision-making platforms and computer programs.
  • The kinds of such decisions made solely by the operation of such computer programs.
  • The kinds of such decisions for which a thing that is substantially and directly related to making the decision is done by the operation of such computer programs.

Ongoing accuracy and currency

Because APP 1 requires policies to be kept up to date, businesses must establish processes to maintain accurate inventories of Automated Decision Systems (ADS) and the data they consume. Iterative model updates or new data sources will trigger policy changes. Accordingly, it’s advisable to update your privacy policy whenever:

  • there is an update of an existing ADS,
  • you commence use of a new ADS,
  • you utilise a new data source in an existing or new ADS, or
  • you use your ADS for new purposes or applications.

Given the rate at which this technology is moving, it is likely, as a practical matter, that your privacy policy will need updating at least every six months.
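Operationally, the triggers above can be tracked against a simple ADS inventory. The following Python sketch (all names and fields are hypothetical, for illustration only) flags when a privacy-policy review is due:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ADSRecord:
    """One entry in a hypothetical ADS inventory."""
    name: str
    data_sources: set      # data sources described in the current policy
    purposes: set          # purposes described in the current policy
    last_disclosed: date   # when the policy last described this system

def policy_update_needed(record, current_sources, current_purposes, last_model_update):
    """True if any trigger has fired since the last disclosure: a model
    update, a new data source, or a new purpose or application."""
    return (
        last_model_update > record.last_disclosed
        or not current_sources <= record.data_sources
        or not current_purposes <= record.purposes
    )

rec = ADSRecord("credit_scoring", {"transaction_history"},
                {"loan_eligibility"}, last_disclosed=date(2026, 1, 15))
# A new data source ("location") appeared after the last disclosure.
print(policy_update_needed(rec, {"transaction_history", "location"},
                           {"loan_eligibility"}, date(2026, 1, 1)))  # True
```

The check is deliberately conservative: any new source, purpose, or model change since the last disclosure prompts a review, which a human then resolves.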

Regulatory enforcement

Non-compliance is an “interference with privacy”, subject to OAIC investigation and civil penalties. Since December 2022, maximum penalties for serious or repeated interferences have been increased to the greater of:

  1. A$50 million,
  2. three times the benefit obtained, or
  3. 30 per cent of adjusted turnover.

A failure to incorporate accurate ADS disclosures may readily be characterised as “serious” where automated decisions materially affect individuals. Accordingly, substantial penalties may apply if ADS disclosures aren’t made.

Furthermore, amendments to the Australian Consumer Law (ACL) will introduce a rebuttable presumption that harm caused by an autonomous product arose from a safety defect unless the supplier can prove otherwise. This flips the existing burden of proof and incentivises proactive safety engineering. It will be particularly important for businesses that sell products that utilise ADS in their operation, such as automobiles, robotics and remote-access systems.

Operational, legal and financial implications

Operational impacts

Compliance will have a fundamental impact on your operations, requiring changes and extra documentation. Below are some practical tips for businesses using ADM systems.

  1. Map all existing and planned ADS, including machine-learning classification, scoring, profiling and recommendation engines.
  2. Catalogue personal-information inputs, purposes, outputs and decision pathways.
  3. Implement change management to ensure privacy policy updates track model iterations.
  4. Coordinate legal, data-science, privacy and product teams to produce intelligible disclosures that balance clarity with protection of intellectual property and security.
  5. Embed governance to evaluate whether an automated decision is likely to have a “significant effect” on individuals, triggering disclosure.

Legal exposure

The OAIC is expected to prioritise enforcement because the reform squarely targets opaque algorithmic processing (which is also a priority target for the ACCC). If not compliant, businesses risk:

  • OAIC or ACCC investigations, enforceable undertakings and determinations,
  • civil penalty proceedings under Part VIB,
  • representative actions or individual claims for privacy interference, and
  • regulatory overlap with sectoral regimes (e.g. ASIC, APRA, ACCC) where ADS outcomes breach financial-services, consumer, or anti-discrimination laws.

It isn’t uncommon for an OAIC or ACCC investigation to cost in excess of A$1 million in legal fees, not to mention the time and resources that responding to it takes away from your business. Accordingly, we recommend advance and continued compliance to avoid (or minimise) the risk of being investigated.

Financial consequences

Compliance is mandatory from 10 December 2026, so we recommend ensuring compliance early to minimise the prospect of increased legal fees for urgent work, or of a penalty or investigation for missing the deadline. Early compliance may also yield competitive benefits through enhanced consumer trust and reduced enforcement risk.

Compliance will entail:

  • one-off costs for system audits, legal advice, and privacy-policy redesign,
  • ongoing costs for governance, model documentation and training,
  • potential technology spend to implement explainability, logging and human-review mechanisms, and
  • indirect costs from delayed product releases or constrained data use.

Key compliance challenges and risk concentrations

Determining when an ADS output “significantly affects” rights or interests is fact-specific. Examples include:

  • Credit approvals
  • Insurance underwriting
  • Pricing
  • Employment screening
  • Utility disconnections
  • Personalised content curation
  • Dynamic pricing
  • Targeted advertising (or remarketing), or
  • Prioritisation of customer service queues.

The last three examples may be less obvious and therefore require careful assessment.

Algorithmic complexity and commercial confidentiality

Businesses using advanced machine learning models may struggle to translate technical logic into summaries meaningful to the average reader without revealing proprietary information. The legislation demands disclosure of types of data and decisions, not source code. However, striking the right balance will require multidisciplinary collaboration and it is recommended you discuss the disclosure with an intellectual property lawyer before making it public. This will both ensure compliance with the APPs and minimise the risk of disclosing trade secrets.

Data provenance and quality

The reforms implicitly oblige entities to know and monitor the personal information variables entering their ADS. For organisations that source data from multiple vendors or ingest user-generated content (including from chatbots), maintaining data lineage and ensuring accuracy will be resource-intensive.

Accordingly, businesses will need to invest more time in considering:

  • what data they are collecting,
  • the purpose(s) for which it is collected, and
  • whether it needs to be retained (and if so, for how long).

It may no longer be commercially appropriate to retain troves of data given the compliance obligations that may attach to it. It is recommended that you discuss the collection of data with a Privacy and Technology lawyer to determine any legal ramifications that may arise from the collation and storing of data by your business.

Third-party service providers

Where decision engines are supplied by vendors or hosted offshore, APP entities remain accountable for compliance. Accordingly, contracts will need robust clauses mandating transparency, data protection controls and timely information flow for policy updates.

This will also be of significance to entities that have multinational operations.

A practical recommendation is to include questions relating to the use of automated decision-making platforms in your vendor and other trading partner due diligence, and to ensure that ADS appears as a specific line item on your audit and other checklists.

Strategic recommendations for compliance readiness

  1. Establish an ADS register and governance framework

Create a central inventory of all automated decision systems, capturing purpose, data inputs, decision logic outline, impact assessment and accountability owner. Integrate this register into privacy management plans reviewed at board level.
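As an illustration, the register’s minimum fields can be checked programmatically before a board-level review. In this Python sketch the field names are hypothetical, not a prescribed schema:

```python
# Minimum fields each ADS register entry should capture, per the text above.
REQUIRED_FIELDS = {"purpose", "data_inputs", "decision_logic_outline",
                   "impact_assessment", "accountability_owner"}

def incomplete_entries(register):
    """Return {system_name: missing_fields} for entries lacking any field."""
    gaps = {}
    for system, entry in register.items():
        missing = {f for f in REQUIRED_FIELDS if not entry.get(f)}
        if missing:
            gaps[system] = missing
    return gaps

register = {
    "fraud_flagging": {"purpose": "Flag suspicious transactions",
                       "data_inputs": ["transaction_history"],
                       "decision_logic_outline": "Gradient-boosted classifier",
                       "impact_assessment": "AIA-2026-03",
                       "accountability_owner": "Head of Risk"},
    "queue_priority": {"purpose": "Order customer service queue",
                       "data_inputs": ["account_tier"]},
}
# "queue_priority" is missing three of the required fields.
print(incomplete_entries(register))
```

A check like this can run on each register update, surfacing gaps before the register feeds into the privacy management plan.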

  2. Conduct Algorithmic Impact Assessments (AIAs)

For each system, evaluate whether outputs significantly affect individuals, identify potential biases, and document mitigation controls. Use AIAs to support the “informed” content of privacy policy disclosures.

  3. Revise privacy policies for clarity and specificity

Translate technical descriptions into plain language. Group decisions by category (e.g. “loan eligibility”, “pricing offers”, “fraud flags”) and specify data categories used (e.g. transaction history, location, behavioural data). Include processes for individuals to seek clarification or review. While no statutory “right to explanation” has yet been enacted in Australia, it is in place overseas, and may be introduced in the future. Accordingly, it is recommended that your compliance system is set up with this in mind.

  4. Strengthen contractual and supply-chain controls

Update procurement templates to require vendors to supply timely, accurate information about ADS operation, data usage and changes. Impose audit rights and incident notification obligations.

  5. Invest in explainability and record-keeping

Ensure models generate reproducible outputs, retain decision logs, and enable post-hoc explanations. This lowers litigation risk and facilitates regulator engagement.
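One way to keep decisions reproducible is to log each outcome together with its exact inputs and model version. A minimal Python sketch follows; the record structure and field names are assumptions for illustration, not a mandated format:

```python
import hashlib
import json
from datetime import datetime, timezone

def log_decision(system, model_version, inputs, output):
    """Build an audit record tying a decision to its exact inputs and the
    model version that produced it, so it can be reproduced and explained."""
    record = {
        "system": system,
        "model_version": model_version,
        "inputs": inputs,
        "output": output,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    # Hash of the decision-relevant fields, so later tampering with the
    # stored record can be detected.
    payload = json.dumps({k: record[k] for k in
                          ("system", "model_version", "inputs", "output")},
                         sort_keys=True)
    record["digest"] = hashlib.sha256(payload.encode()).hexdigest()
    return record

entry = log_decision("credit_scoring", "v2.3",
                     {"income": 85000, "postcode": "4000"}, "approved")
print(entry["digest"][:12])  # a stable fingerprint of this decision
```

Retained logs in this shape support post-hoc explanation requests and give a regulator a verifiable trail from input data to outcome.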

  6. Train staff and embed a culture of transparency

Provide targeted training to data scientists, product managers and legal/compliance teams on the new obligations, the concept of “significant effect”, and the importance of maintaining accurate, consumer-facing disclosures.

Conclusion

Achieving sustained compliance will demand rigorous governance, technical transparency and cross-functional coordination, with the help of knowledgeable advisors. Organisations that act early to inventory their systems, assess impacts, renegotiate third-party arrangements and communicate clearly with customers will be rewarded. Not only will these organisations avoid regulatory sanction and substantial financial penalties, but their actions will strengthen consumer trust in their data-driven technologies.

While uncertainties remain around the threshold of “significant effect” and how much algorithmic detail regulators will expect, we recommend achieving compliance well in advance of 10 December 2026 to future-proof against the accelerating wave of digital regulatory reform. If you would like advice on kick-starting your ADM compliance, get in touch with our privacy lawyers today.

 

The information contained in this article is general in nature and cannot be relied on as legal advice nor does it create an engagement. Please contact one of our lawyers listed above for advice about your specific situation.
