Ahead of its September 8 board meeting, the California Privacy Protection Agency (CPPA) has issued draft regulations on cybersecurity audits and risk assessments.  Public comments will be requested once the formal rulemaking process is kicked off.  Accordingly, the draft regulations are subject to change.  Below are the key takeaways:

Cybersecurity Audits

  • New cybersecurity audit requirement.  Certain categories of businesses would be required to perform cybersecurity audits.  The Board will consider several options for the thresholds a business must meet to be subject to the requirement, such as the number of consumers whose personal information the business processed in the preceding year and whether the business met a specified annual gross revenue threshold.
  • Timing.  A business subject to the audit requirement would have 24 months from when the rules go into effect to complete its first audit and would be required to complete an audit annually thereafter.
  • Scope.  The CPPA will consider scoping the audit requirement based on a number of factors, such as:
    • economic, physical, and psychological harms associated with unauthorized access to, or use or disclosure of, personal information;
    • any risks from cybersecurity threats or incidents that have materially affected, or could materially affect, consumers.
  • Reporting.  A business subject to the audit requirement would be required to submit an annual notice of compliance to the CPPA, including a written certification that the business either did or did not comply with the audit requirements.

Risk Assessments

  • New definitions.  “Artificial intelligence” and “Automated Decisionmaking Technology” (ADMT) are defined broadly.
  • New risk assessment requirement.  A business whose processing activities would present a significant risk to consumers would be required to conduct a risk assessment before processing.
  • Timing.  Risk assessments may be required annually, biennially (every two years), or once every three years.  In addition, a business would need to update its risk assessment after a material change in processing activity, such as a change to the processing purpose, the degree of human involvement, or the logic of the ADMT.
  • Additional requirements for businesses using ADMT.  A business that uses ADMT would need to explain why ADMT was used to achieve a particular purpose and how the business plans to use the outputs obtained from the ADMT.
  • Additional requirements for businesses using personal information to train AI and ADMT.  A business that processes personal information to train AI or ADMT that it makes available to other persons or businesses would be required to explain the purposes for which the AI or ADMT may be used and any safeguards the business has put in place.
  • Reporting.  Businesses would need to certify their risk assessments to the CPPA annually.

Lindsey Tonsager co-chairs the firm’s global Data Privacy and Cybersecurity practice. She advises clients in their strategic and proactive engagement with the Federal Trade Commission, the U.S. Congress, the California Privacy Protection Agency, and state attorneys general on proposed changes to data protection laws, and regularly represents clients in responding to investigations and enforcement actions involving their privacy and information security practices.

Lindsey’s practice focuses on helping clients launch new products and services that implicate the laws governing the use of artificial intelligence, data processing for connected devices, biometrics, online advertising, endorsements and testimonials in advertising and social media, the collection of personal information from children and students online, e-mail marketing, disclosures of video viewing information, and new technologies.

Lindsey also assesses privacy and data security risks in complex corporate transactions where personal data is a critical asset or data processing risks are otherwise material. In light of a dynamic regulatory environment where new state, federal, and international data protection laws are always on the horizon and enforcement priorities are shifting, she focuses on designing risk-based, global privacy programs for clients that can keep pace with evolving legal requirements and efficiently leverage the clients’ existing privacy policies and practices. She conducts data protection assessments to benchmark against legal requirements and industry trends and proposes practical risk mitigation measures.

Jemie Fofanah is an associate in the firm’s Washington, DC office. She is a member of the Privacy and Cybersecurity Practice Group and the Technology and Communication Regulatory Practice Group. She also maintains an active pro bono practice with a focus on criminal defense and family law.