On October 10, 2022, the draft rules implementing the Colorado Privacy Act ("CPA") were officially published in the Colorado Register. Written comments on the draft rules are due by November 7, 2022. The CPA draft rules share some similarities with the draft rules set forth by the California Privacy Protection Agency ("CPPA") interpreting the California Privacy Rights Act ("CPRA"). Both sets of draft rules address requirements for privacy policy disclosures, consumer rights requests, and opt-out mechanisms. However, there are a number of key differences between the two drafts. We highlight some of these below.

Universal Opt-out

CPRA draft rules: The draft CPRA rules would require businesses to honor opt-out preference signals and prescribe detailed requirements for responding to these requests. § 7025. However, the rules do not indicate which specific signals must be recognized or establish a process for officially recognizing any particular signal.

CPA draft rules: The Attorney General would maintain a list of pre-approved universal opt-out mechanisms, each of which controllers would need to honor. The draft rules reiterate that, to be recognized, a mechanism must comply with all statutory criteria, including that it must not prevent the controller from determining whether the consumer is a Colorado resident. In addition, the Attorney General contemplates that a universal opt-out mechanism could allow consumers to opt out of "all purposes" or "specific purposes" (i.e., data sales or targeted advertising). A controller can override the universal opt-out if a consumer consents to that controller's processing of the data.

Dark Patterns

CPRA draft rules: The draft rules require "symmetry in choice" (including both "yes" and "no" options and prohibiting bundled consent) and that the number of steps or clicks needed to opt out not exceed the number needed to opt in. § 7004.

CPA draft rules: The draft rules provide numerous product design examples that would constitute impermissible dark patterns, some of which are commonly used business practices. For example, because the size, font, and styling of two choices must be "equal" and "symmetrical," it is a dark pattern to present an "I accept" button in a larger size than the "I do not accept" button, or to display the "I do not accept" button in a greyed-out color while the "I accept" button is in a bright color.

Opt-In Consent for Incompatible Uses

CPRA draft rules: The draft regulations require businesses to obtain the consumer's explicit consent before processing personal information for purposes that are unrelated to or incompatible with the purposes for which the personal information was collected or processed.

CPA draft rules: Instead of adopting the FTC's standard for making material prospective or retroactive changes to privacy notices, the draft regulations would require consent to process personal data in ways that are not reasonably necessary for, or compatible with, the purposes specified in the privacy notice, as judged against seven criteria specified in the regulations.

Profiling

CPRA draft rules: The statutory text requires rulemaking on automated decisionmaking, but the latest draft of the CPRA regulations does not address profiling requirements.

CPA draft rules: The profiling opt-out would apply only where automated decisions produce legal or other similarly significant effects and where a human reviewer cannot change or influence the decision based on meaningful review of the available data. The rules also specify the level of transparency that must be provided in privacy notices with respect to profiling and automated decisions, including, for example, the decisions subject to profiling, the categories of data used in profiling, a plain-language explanation of the logic used in profiling, and whether the system has been evaluated for accuracy, fairness, and bias. In addition, the rules contain prescriptive requirements for conducting data protection assessments related to profiling, including, where profiling is conducted using third-party software, the name of the software and copies of any internal or external evaluations of the software's accuracy and reliability.

Loyalty Programs

CPRA draft rules: The draft rules provide an example of how the financial incentive provisions apply to loyalty programs.

CPA draft rules: Prescriptive disclosures would be required for loyalty programs. The draft rules also specify how the operation of a loyalty program should change if a consumer exercises any of their CPA rights.

Assessments

CPRA draft rules: The latest draft regulations do not address the statute's cybersecurity and risk assessment requirements; the CPPA has indicated these rules will come later.

CPA draft rules: The draft rules contain expansive and prescriptive requirements for what must be included in risk assessments. However, controllers can rely on assessments prepared for other jurisdictions as long as they are "reasonably similar in scope and effect."

Sensitive Data Inferences

CPRA draft rules: The draft CPRA rules focus on implementing the right to limit the use and disclosure of sensitive personal information, which is limited to sensitive personal information that is collected or processed for the purpose of creating inferences about a consumer.

CPA draft rules: The draft CPA rules define "sensitive data inferences" and then set out the conditions under which controllers may process such inferences without consent.

Trade Secret Protection

CPRA draft rules: The statute contains protections for trade secrets, which the draft regulations cannot alter.

CPA draft rules: The law similarly protects trade secrets. Notably, however, the draft rules require controllers to disclose personal data or sensitive data inferences created using a trade secret algorithm, without disclosing the algorithm itself. Rule 4.07(B)(1).

Privacy Disclosures and Consumer Rights

CPRA draft rules: The rules prescribe formatting and content requirements for privacy notices, mechanisms for submitting consumer rights requests, and how businesses should respond to such requests.

CPA draft rules: Like California's, the draft regulations include extensive formatting and content requirements for privacy notices, and specify how controllers should permit consumers to exercise their rights as well as how controllers should respond to consumer rights requests.
Lindsey Tonsager

Lindsey Tonsager helps national and multinational clients in a broad range of industries anticipate and effectively evaluate legal and reputational risks under federal and state data privacy and communications laws.

In addition to assisting clients engage strategically with the Federal Trade Commission, the U.S. Congress, and other federal and state regulators on a proactive basis, she has experience helping clients respond to informal investigations and enforcement actions, including by self-regulatory bodies such as the Digital Advertising Alliance and Children’s Advertising Review Unit.

Ms. Tonsager’s practice focuses on helping clients launch new products and services that implicate the laws governing the use of endorsements and testimonials in advertising and social media, the collection of personal information from children and students online, behavioral advertising, e-mail marketing, artificial intelligence, the processing of “big data” in the Internet of Things, spectrum policy, online accessibility, compulsory copyright licensing, telecommunications, and new technologies.

Ms. Tonsager also conducts privacy and data security diligence in complex corporate transactions and negotiates agreements with third-party service providers to ensure that robust protections are in place to avoid unauthorized access, use, or disclosure of customer data and other types of confidential information. She regularly assists clients in developing clear privacy disclosures and policies―including website and mobile app disclosures, terms of use, and internal social media and privacy-by-design programs.

Jayne Ponder

Jayne Ponder is an associate in the firm’s Washington, DC office and a member of the Data Privacy and Cybersecurity Practice Group. Jayne’s practice focuses on a broad range of privacy, data security, and technology issues. She provides ongoing privacy and data protection counsel to companies, including on topics related to privacy policies and data practices, the California Consumer Privacy Act, and cyber and data security incident response and preparedness.