California is moving decisively from transparency-based privacy regulation to substantive governance of automated decision making. The new Automated Decision-Making Technology (ADMT) regulations issued by the California Privacy Protection Agency (CPPA) materially expand consumer rights and impose affirmative compliance, documentation, and risk-assessment obligations on businesses that use AI-driven or automated systems to make consequential decisions about individuals.
For companies operating at scale, these rules require operational changes, new disclosures, internal review processes, and defensible documentation well before the January 1, 2027 effective date.
Scope and Applicability: Who Is Covered
The ADMT regulations are part of the broader California Consumer Privacy Act (CCPA) framework, as amended by the California Privacy Rights Act (CPRA). They generally apply to for-profit businesses doing business in California that meet any of the following thresholds:
- Annual gross revenue exceeding US $25 million
- Buying, selling, or sharing personal information of 100,000 or more California consumers or households
- Deriving 50 percent or more of annual revenue from selling or sharing personal information
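Because coverage turns on meeting *any* one of the three thresholds, the applicability test can be sketched as a simple disjunction. This is an illustrative aid only, not legal logic; the function and parameter names are hypothetical, and the figures come from the thresholds listed above.

```python
def is_covered_business(
    annual_gross_revenue_usd: float,
    ca_consumers_or_households: int,
    revenue_share_from_selling_or_sharing: float,
) -> bool:
    """Illustrative sketch of the CCPA/CPRA applicability thresholds.

    A for-profit business doing business in California is covered
    if it meets ANY one of the three criteria.
    """
    return (
        annual_gross_revenue_usd > 25_000_000
        or ca_consumers_or_households >= 100_000
        or revenue_share_from_selling_or_sharing >= 0.50
    )

# Example: $10M revenue, 120,000 CA consumers, 5% of revenue from data sales
# Covered, because the consumer-count threshold alone is met.
print(is_covered_business(10_000_000, 120_000, 0.05))  # True
```

Note that the criteria are independent: a business well under the revenue threshold can still be covered solely by the volume of personal information it handles.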
The regulations define ADMT broadly as any technology that processes personal information using computation to replace or substantially replace human decision making. This includes many AI- and machine-learning-driven tools used in employment, lending, healthcare, education, and similar contexts.
The rules exclude basic infrastructure technologies, such as data storage, networking, firewalls, spellchecking tools, and spreadsheets, so long as they do not replace human decision making. The regulatory focus is not on automation per se but on automation that meaningfully determines outcomes for individuals.
Front-End Obligations
The compliance burden intensifies when ADMT is used to make or materially inform significant decisions. These include decisions affecting:
- Financial or lending services
- Housing
- Education enrollment or opportunities
- Employment or independent contracting
- Compensation
- Healthcare services
Notably, targeted advertising alone is excluded.
Before using ADMT to process personal information, businesses must provide consumers with a clear and conspicuous notice at or before the point of data collection. This pre-use notice must explain: (i) the specific purpose for using ADMT; (ii) the consumer's right to opt out of the ADMT or to appeal to a human reviewer; (iii) the right to access information about the ADMT; (iv) how the ADMT processes personal information, including the relevant categories of data; and (v) the extent to which ADMT outputs drive significant decisions, along with the alternatives available if the consumer opts out.
Opt-Out and Appeal Rights
Businesses must offer at least two methods for consumers to opt out of the use of ADMT, such as online forms, phone numbers, email, or mail-based requests. There are limited exceptions, most notably where a consumer already has a right to human appeal, or ADMT is used solely for work or educational assessment, allocation, or compensation without unlawful discrimination.
Consumers may request detailed information about how ADMT was used in relation to them, including:
- The purpose of the ADMT
- How personal information was processed
- The role the ADMT output played in the final decision
- Whether and how human review was involved
Businesses must:
- Acknowledge requests within 10 business days
- Substantively respond within 45 days, with a possible 45-day extension
- Maintain verification procedures and document denials where identity cannot be confirmed
Retaliation against consumers exercising these rights is expressly prohibited.
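The response timeline above mixes business days (for acknowledgment) with calendar days (for the substantive response), which is easy to get wrong in a compliance tracker. The following sketch illustrates the arithmetic under simplifying assumptions: business days are Monday through Friday with holidays ignored, and the function and field names are hypothetical.

```python
from datetime import date, timedelta

def add_business_days(start: date, n: int) -> date:
    """Advance n business days (Mon-Fri), skipping weekends.
    Holidays are ignored in this simplified sketch."""
    d = start
    while n > 0:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday=0 .. Friday=4
            n -= 1
    return d

def request_deadlines(received: date, extended: bool = False) -> dict:
    """Illustrative deadline calculator for ADMT access requests:
    acknowledge within 10 business days; substantively respond
    within 45 calendar days, plus a possible 45-day extension."""
    respond_days = 90 if extended else 45
    return {
        "acknowledge_by": add_business_days(received, 10),
        "respond_by": received + timedelta(days=respond_days),
    }

# Request received Monday, March 1, 2027
deadlines = request_deadlines(date(2027, 3, 1))
print(deadlines["acknowledge_by"])  # 2027-03-15
print(deadlines["respond_by"])      # 2027-04-15
```

With the extension invoked, the substantive response in this example would be due May 30, 2027, ninety calendar days after receipt.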
The Core Compliance Obligation
The most consequential obligation under the ADMT regulations is the requirement to conduct and maintain formal risk assessments. A business must conduct a risk assessment if it uses ADMT to make or meaningfully inform a significant decision about a consumer, or if it processes personal information with the intent to train ADMT systems, even if the system is not yet deployed.
In addition, a risk assessment is required whenever a business uses automated processing for purposes of profiling,1 reflecting the CPPA's concern with systems that evaluate, predict, or categorize individuals in ways that may affect their rights or opportunities.
The purpose of the risk assessment is to evaluate whether the risks to a consumer's privacy from processing personal information outweigh the benefits to the business, and to require the business to manage the risks it identifies. The risk assessment must be documented in a written report that addresses factors including:
- The business purpose for processing personal information using ADMT
- Categories of personal information involved
- Methods of collection, use, disclosure, retention, and processing
- Disclosures provided to consumers regarding the processing of personal information
- Third parties (service providers, contractors, vendors) with access to the data
- The logic used by the ADMT, including its assumptions and limitations, how outputs are generated, and how outputs are used in practice to make or inform significant decisions
- Data privacy risks, including unauthorized access to or disclosure of personal information, as well as economic, physical, reputational, and psychological impacts
- Cybersecurity safeguards, such as encryption
- Policies, procedures, and training to ensure the ADMT works for the stated purpose and does not unlawfully discriminate
- Identification of contributors, reviewers, and approvers of the assessment
Risk assessments must be conducted at least once every three years. However, the regulations also require updates within 45 calendar days of any material change to the processing activity that could create new or increased negative impacts to consumers' privacy. Businesses must retain all original and updated versions of each risk assessment for as long as the processing continues, or five years, whichever is longer.
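The cadence and retention rules above can be captured in a few lines of date arithmetic. This is a hedged sketch: the function names are hypothetical, and it assumes the five-year retention period runs from the date of the assessment version, which the text does not state explicitly.

```python
from datetime import date, timedelta

def next_review_due(last_assessment: date) -> date:
    """Risk assessments must be conducted at least once every three years."""
    return last_assessment.replace(year=last_assessment.year + 3)

def update_due_after_material_change(change_date: date) -> date:
    """An updated assessment is required within 45 calendar days
    of any material change to the processing activity."""
    return change_date + timedelta(days=45)

def retain_until(processing_end: date, assessment_date: date) -> date:
    """Retain every version for as long as processing continues,
    or five years, whichever is longer (five years assumed to run
    from the assessment date)."""
    return max(processing_end, assessment_date.replace(year=assessment_date.year + 5))

# A material change on January 10, 2026 triggers an update due February 24, 2026.
print(update_due_after_material_change(date(2026, 1, 10)))  # 2026-02-24
```

In practice, the 45-day update trigger means change management for the underlying system, not the calendar, usually drives reassessment.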
For risk assessments conducted in 2026 and 2027, the business must submit a copy of its risk assessment to the CPPA no later than April 1, 2028. For assessments conducted after 2027, submissions must be made no later than April 1, 2029. Each risk assessment must be accompanied by an attestation that the information is true and correct, along with the name and business title of the individual executing the attestation. The California Attorney General may require a business to submit its risk assessment at any time.
Conclusion
California's ADMT regulations signal a shift toward direct oversight of automated decision-making and place new emphasis on governance, documentation, and accountability. Businesses that rely on AI-driven tools should use the time before the January 1, 2027 effective date to assess where automated decisions are occurring, how those systems are governed, and whether existing processes can withstand regulatory scrutiny. Early, disciplined preparation will be critical to managing both compliance risk and potential enforcement exposure.
Footnote
1. The CPPA Regulations define “profiling” as: Any form of automated processing of personal information to evaluate certain personal aspects (including intelligence, ability, aptitude, predispositions) relating to a natural person and in particular to analyze or predict aspects concerning that natural person's performance at work, economic situation, health (including mental health), personal preferences, interests, reliability, predispositions, behavior, location, or movements.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.