Massachusetts Voters Approve Measure for Expanded Access to Vehicle Data

In a roller coaster of an election week, it was easy for smaller ballot measures to become overshadowed. One ballot measure that you may have missed is Massachusetts’s Ballot Question 1 regarding the “right to repair” motor vehicles. Vehicles are becoming increasingly computerized due to the use of “telematics systems,” which collect and wirelessly transmit mechanical data to a remote server. The mechanical data captured can include location, speed, braking, fuel consumption, vehicle errors, and more. Ballot Question 1 expands access to that data.

Beginning with model year 2022, manufacturers of motor vehicles sold in Massachusetts that use telematics systems must equip the vehicles with a standardized open access data platform. This will allow owners and independent vehicle repair facilities to access the data. Owners will be able to easily view mechanical data on their cellphones. Ballot Question 1 will also give owners more options for car repair services. Local vehicle repair shops will now have access to data that is usually restricted to dealers.

Opponents of Ballot Question 1 are concerned about the widened net of access to a driver’s data, especially location data. There are also concerns about the increased risk for security breaches. Privacy advocates have long voiced concerns about large-scale collection of location data. In fact, the New York Times Privacy Project has published a series of articles about the dangers of unregulated collection of location data.

With the influx of cars with telematics systems hitting the market, ballot measures surrounding access to vehicle data will likely increase in the next few years. However, with countervailing privacy pushes to regulate or limit the use of location data, we may also see a tug of war between laws that seek to provide access to location data and those that seek to regulate the collection and use of that information.

Continue to look for further updates and alerts from Bradley on state privacy rights and obligations.

No Unreasonable Searches or Seizures of Electronic Data in Michigan

The most intimate information can be found in the data on our cellphones and laptops, from geolocation data to search history. The level of privacy protection afforded to electronic data and communications has been ambiguous for years, but after this election, Michigan now has some clarity.

On November 3, 2020, Proposal 2 was passed by an overwhelming majority, amending Article 1, Section 11 of Michigan’s Constitution. Proposal 2 provides personal protections by prohibiting unreasonable searches or seizures of electronic data and communications. It does so by requiring law enforcement to obtain a search warrant to access an individual’s electronic data and communications under the same conditions required to search an individual’s home or seize an individual’s papers or physical items.

The amendment will fill a gap in the Michigan Constitution. Before the passing of Proposal 2, the Michigan Constitution provided protections against unreasonable searches and seizures for physical items but did not explicitly mention electronic data or communications. With the amendment to the Michigan Constitution, there will be no more ambiguity over whether data — such as cell phone data, communications with Alexa and Google Home, or Apple Watch health metrics — is private and requires a warrant to be searched or seized by law enforcement in Michigan.

Michigan is not the first state (and likely won’t be the last) to provide some level of protection for electronic data and communications. Utah has enacted a law banning warrantless searches and seizures of electronic data and communications. Moreover, the Florida Constitution mentions “the right of the people to be secure … against the unreasonable interception of private communications.” However, neither Utah’s statute nor Florida’s constitutional language is as strong as Michigan’s explicit constitutional amendment. Going forward, it is likely that more states will follow Michigan’s lead in protecting electronic data and communications through constitutional amendments.

Continue to look for further updates and alerts from Bradley on state privacy rights and obligations.

New “Basic Assessment” Is a Bridge to CMMC for Defense Contractors

The Department of Defense (DoD) continues to enhance cybersecurity requirements in its supply chain. A new rule requires some contractors to assign a numerical score to their current cybersecurity practices. Additionally, the rule begins rolling out requirements for all defense contractors to have their cybersecurity certified by a third party.

For years, the gold standard for defense contractors has been NIST SP 800-171 (the NIST Standard). The NIST Standard establishes cybersecurity practices for companies that handle DoD “controlled unclassified information” (CUI). Historically, the NIST Standard was largely aspirational, and contractors have been allowed to self-certify that they either comply or have a plan to comply in the future. That looseness led to varying degrees of — and endlessly delayed plans for — compliance.

To address those shortcomings, the Cybersecurity Maturity Model Certification (CMMC) Framework will end the self-certification option. Instead, contractors will need certification from a CMMC Third-Party Assessment Organization (C3PAO). C3PAOs must themselves be accredited by an Accreditation Body. The CMMC Framework is being rolled out over the next five years, starting November 30, 2020. But no C3PAOs have yet been accredited, so it will be a while before contractors can be CMMC certified.

In the meantime, as a bridge to CMMC, the rule establishes a more robust assessment framework for the NIST Standard. Rather than self-certify compliance, contractors must specifically score their practices according to a detailed list of controls, on a scale from -203 to +110. This “Basic Assessment” score will be posted to the Supplier Performance Risk System (SPRS), where it can affect procurement decisions. That gives contractors additional incentive to comply with the NIST Standard sooner, rather than later. Because of substantial overlap between the NIST Standard and the CMMC controls, the scoring could smooth the eventual transition to the CMMC Framework. As Bradley has reported, this Basic Assessment is due from covered defense contractors by November 30, 2020.

NIST SP 800-171 DoD Assessment Methodology

DoD contractors are already familiar with the Defense Federal Acquisition Regulation Supplement (DFARS) cybersecurity clause 252.204-7012. It is included in nearly all DFARS-covered contracts and requires that contractors’ cybersecurity meet the NIST Standard. Historically, the DoD had no way to verify a contractor’s implementation of the standard. The new interim rule creates a more specific, standardized self-assessment methodology.

Under this new NIST SP 800-171 DoD Assessment methodology, contractors still self-assess their compliance. What’s new is the standardized, uniform methodology to be used for assessment, in which a contractor scores itself on a scale from -203 to +110, based on the controls with which it complies. In addition to the basic assessment — which is a self-assessment — after award, the government may in some cases conduct its own medium or high assessment of a contractor’s cybersecurity. Assessments generally expire after three years.
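The scoring arithmetic itself is simple: a contractor starts from a perfect score of 110 and subtracts the weighted value of each control it has not implemented. Below is a minimal Python sketch of that arithmetic. The control identifiers and weights shown are hypothetical samples; the authoritative weights come from the DoD’s NIST SP 800-171 Assessment Methodology document.

```python
# Sketch of the Basic Assessment scoring arithmetic (illustrative only).
# Under the methodology, each of the 110 controls carries a weighted
# deduction (1, 3, or 5 points) if unimplemented; the floor is -203.
MAX_SCORE = 110  # score with every NIST SP 800-171 control implemented

def basic_assessment_score(unimplemented):
    """Subtract the weighted value of each unimplemented control
    from the maximum score of 110."""
    return MAX_SCORE - sum(unimplemented.values())

# Hypothetical contractor missing three controls weighted 5, 5, and 3:
missing = {"3.1.1": 5, "3.5.3": 5, "3.13.11": 3}
print(basic_assessment_score(missing))  # 110 - 13 = 97
```

A contractor with every control in place would simply report 110; each remediated control raises the posted SPRS score by that control’s weight.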

The basic assessment applies across the board to all defense contractors who handle DoD CUI.

CMMC Framework

The industry has been preparing for CMMC since last year, so its entry into the DFARS comes as no surprise. As discussed above, no C3PAOs have yet been accredited. Nonetheless, efforts to comply with the CMMC standards will not be wasted.

CMMC has five levels of compliance, and the required compliance level will be defined in each contract based on the associated risks. Every contractor will need to meet CMMC Level 1. And any contractor handling CUI will probably need to meet at least CMMC Level 3. The Level 3 requirements remain very similar to the NIST Standard. For many contractors, then, their CMMC efforts will help improve their Basic Assessment score.

Risks of Noncompliance

Though the Basic Assessment is a self-assessment, defense contractors should score themselves honestly. In the Eastern District of California, claims are proceeding against Aerojet Rocketdyne under the False Claims Act (FCA) for its allegedly false representations of cyber compliance. In allowing the suit to move forward, the court recognized the possibility that “the government never expected full compliance.” However, the specific “extent to which a company was technically compl[ia]nt still mattered.” Earlier, the Ninth Circuit had allowed similar FCA claims to proceed against Raytheon in another closely watched case. Those claims have since been dismissed with finality. But both the Raytheon and Aerojet Rocketdyne cases underscore the importance of honest and accurate assessments of cyber compliance.


The DoD recognizes the global importance of strong cybersecurity. This new rule shows that the government is tightening not only the security requirements themselves but also the assessment framework and reporting requirements that ensure robust controls and compliance. But many defense contractors — especially small businesses — may chafe under some of the new requirements. Assessing, conforming, and certifying their systems and procedures could cost even a relatively small contractor many tens of thousands of dollars. Bradley will continue to report on new developments.

If you have any questions about the topics discussed in this article or any related issues, please feel free to contact Aron Beezley, David Lucas, or Andrew Tuggle.

2020 Brings Times of Change: Key Privacy Law Updates This Year

The privacy law landscape is constantly changing, and it can feel like a daunting task for businesses to keep up with the laws of 50 U.S. states plus any applicable international laws. 2020 has been a banner year for change on many fronts. COVID-19 and the elections have caused profound changes, and for those affected by shifting privacy laws, it has been a remarkable year as well.

For example, the California Consumer Privacy Act (CCPA) went into effect on January 1, 2020; the final regulations under CCPA were approved by the California Office of Administrative Law in August of 2020; and shortly thereafter, the November elections brought additional change with the passage of the California Privacy Rights Act (CPRA). CPRA does not go into effect until January 1, 2023; however, it has a one-year lookback, which means that companies will need to be largely in compliance by January 1, 2022. Additionally, anyone who has implemented CCPA or GDPR will attest to how quickly two years can fly by when attempting to understand the multitude of changes imposed by a comprehensive privacy law like CPRA. The combination of new requirements and the broad scope of CPRA will need to be understood and implemented in privacy policies and procedures to ensure compliance by January 1, 2023.

However, the CCPA/CPRA changes are only one example of the consumer data privacy legislation changes this year. According to the National Conference of State Legislatures, in 2020, bills relating to consumer data privacy legislation were considered in at least 30 states and in Puerto Rico (see NCSL 2020 Consumer Data Privacy Legislation). Though most of these bills were not passed, the fact that these bills were considered is an indicator of the interest in protection of consumer data and seems to foreshadow an increase in privacy regulation in the future.

From an international perspective, 2020 also brought the invalidation of the EU – U.S. Privacy Shield framework by Schrems II, which caused many businesses to have to rethink their approach to transfers of personal data between the European Union or United Kingdom and the U.S. (see Schrems II, Part 2 – Additional Guidance for the Transfer of Personal Data Between the EU and U.S.). Schrems II did not invalidate the use of Standard Contractual Clauses (SCCs) for transfer of data, but it did call into question whether the SCCs are adequate to address the risks associated with data transfers to a non-EU country. The data exporter may need to apply supplementary measures, in addition to SCCs, to protect the personal data when transferred. Supplementary measures can include encryption, anonymization, and pseudonymization, as well as other tools. Schrems II requires that businesses analyze the protections currently in place for data transfers between the EU or the UK and the U.S. to ensure compliance.

Awareness of these changes and implementing privacy policies and practices that protect your business are key during these changing times. Continue to rely on Bradley to keep you up to date on privacy rights and obligations.

Privacy at the Polls: Portland, Maine Votes to Ban Facial Recognition Technology

While the nation waits for the results of the presidential race to be tallied, local and statewide referendums on privacy issues have been decided across the country. In Portland, Maine, voters approved a ballot measure to ban the use of facial recognition technology by local police and city agencies. Portland joins other cities such as Boston, San Francisco, and (the other) Portland, Oregon, that have already banned the use of this technology.

Facial recognition is an increasingly common method of identifying or verifying the identity of an individual using their facial features. Facial recognition software is often particularly bad at recognizing minorities, women, and younger people, and its misidentifications can disparately impact those groups — oftentimes in serious ways when the technology is used by law enforcement or government agencies.

Portland, Maine’s ballot initiative added teeth to an ordinance passed by the Portland city council in August 2020. That ordinance included the ban on facial recognition technology but did not include any enforcement measures. The recently passed ballot initiative includes measures allowing citizens to sue the city for violations, requiring the city to suppress evidence illegally obtained through the use of this technology, and making violations of the ordinance by city employees grounds for suspension or termination. The initiative additionally provides for up to $1,000 in penalties for violations, plus attorneys’ fees.

With the rise of facial recognition technology, many advocates have warned of the potential privacy and abuse implications, especially when such technology is employed by law enforcement or state agencies. Portland, Maine’s vote to approve this ban may be a signal of what’s to come in other cities across the country.

Continue to look for further updates and alerts from Bradley on state privacy rights and obligations.

Introducing… the Global Privacy Control

One of the most frequent questions we’ve gotten from companies subject to CCPA that have a “Do Not Sell” link has been “What the heck do we do about this global privacy control?” Until now, there wasn’t a clear, or even semi-helpful, answer to that question that didn’t involve a fair amount of guesswork. We now have our answer — the aptly named “global privacy control” — but what exactly does it mean?

This concept of “user-enabled global privacy controls” was introduced in the CCPA regulations and left companies scratching their heads as to what it meant. Specifically, Section 999.315(c) states:

If a business collects personal information from consumers online, the business shall treat user-enabled global privacy controls, such as a browser plug-in or privacy setting, device setting, or other mechanism, that communicate or signal the consumer’s choice to opt-out of the sale of their personal information as a valid request submitted pursuant to Civil Code section 1798.120 for that browser or device, or, if known, for the consumer.

The use of the word “shall” coupled with the seemingly unascertainable scope of this provision understandably got the attention of those tasked with CCPA compliance. Based on a literal reading, a business has to somehow monitor for the development of any type of mechanism that might provide an opt-out and recognize it or risk being considered non-compliant with CCPA. One caveat, subsection (1), provided that “[a]ny privacy control developed in accordance with these regulations shall clearly communicate or signal that a consumer intends to opt-out of the sale of personal information.” So, businesses only have to monitor for every possible mechanism that “clearly” communicates or signals an intention. This was an added revision made to the original draft regulations, so presumably the regulators see this as a meaningful limitation. Nevertheless, there remains no apparent limitation on a business’ obligation to proactively monitor for highly technical implementations that a business may have no internal capability to address, even if it identifies such a global privacy control. For those wrestling with this dilemma there was a temporary measure of comfort. Specifically, in the Final Statement of Reasons for the CCPA Regulations, the OAG stated that the subsection cited above “is forward-looking and intended to encourage innovation and the development of technological solutions to facilitate and govern the submission of requests to opt-out” (see FSOR at p. 37). So we knew, at the very least, the OAG had no signals in mind at the time and businesses were not expected to be processing any.

Unfortunately, it would appear that the window of comfort is coming to a close. A number of organizations, including the likes of DuckDuckGo, the Electronic Frontier Foundation, Mozilla, the New York Times and the Washington Post, are implementing the aptly named “global privacy control” (GPC) specification. This specification explicitly references this provision of the CCPA regulations, stating “[t]he GPC signal will be intended to communicate a Do Not Sell request from a global privacy control, as per CCPA-REGULATIONS §999.315.” Given the express intent and the industry players involved, it would appear that this is the first foray into the user-enabled global privacy control. Businesses that have a “Do Not Sell” link should take note and begin to determine how they can comply.
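Mechanically, the GPC specification has a participating browser send the HTTP request header Sec-GPC: 1 with each request. A minimal Python sketch of server-side detection follows; the surrounding request handling and how a business records the opt-out are hypothetical and will vary by implementation.

```python
# Sketch of server-side Global Privacy Control detection (illustrative).
# Per the GPC specification, a participating browser sends "Sec-GPC: 1";
# how the opt-out is then recorded is up to the business.
def gpc_opt_out_requested(headers):
    """Treat a Sec-GPC: 1 request header as a do-not-sell opt-out
    signal under CCPA regulations section 999.315(c)."""
    return headers.get("Sec-GPC", "").strip() == "1"

# Hypothetical incoming request headers:
request_headers = {"User-Agent": "ExampleBrowser/1.0", "Sec-GPC": "1"}
if gpc_opt_out_requested(request_headers):
    # Record the opt-out for this browser or device (hypothetical step).
    print("opt-out received")
```

The specification also exposes the signal to page scripts via the navigator.globalPrivacyControl property, so the same check can be made client-side.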

Even though this one has cornered the market on the name, it is highly doubtful this will be the last user-enabled control to signal a user’s intent to opt-out, so businesses need to dedicate resources to addressing this evolving issue.

Governor Approves CCPA Amendment to Further Except Healthcare and Research Information

Gov. Gavin Newsom recently approved A.B. 713, a bill that creates further CCPA exceptions for healthcare and research information. The bill is especially potent in the COVID-19 era where the need for medical research is greater than ever.

A.B. 713 presents a few notable changes from prior versions of the CCPA. First, the amendment expands the prior exemption for clinical trials to now include information that is collected, used, or disclosed in “research.” Research is broadly defined in Section 164.501 of HIPAA as “a systematic investigation, including research development, testing, and evaluation, designed to develop or contribute to generalizable knowledge.”

Second, the amendment expressly exempts information that is deidentified pursuant to either the expert determination method or safe harbor method provided for in Section 164.514 of HIPAA. It is also a requirement that the information is “collected, created, transmitted, or maintained by an entity regulated by the Health Insurance Portability and Accountability Act, the Confidentiality of Medical Information Act, or the Federal Policy for the Protection of Human Subjects, also known as the Common Rule.” Furthermore, an entity that sells or discloses deidentified patient information must disclose in its online privacy policy which method was used to deidentify the information.
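As a rough illustration of the safe harbor method, deidentification amounts to stripping the enumerated identifier fields from a record. The field names in this sketch are hypothetical; Section 164.514(b)(2) of HIPAA lists the 18 identifier categories in full.

```python
# Sketch of the HIPAA "safe harbor" deidentification approach (illustrative).
# The field names here are hypothetical stand-ins for a subset of the 18
# identifier categories enumerated in Section 164.514(b)(2).
SAFE_HARBOR_FIELDS = {"name", "ssn", "email", "phone", "address", "mrn"}

def deidentify(record):
    """Return a copy of the record with identifier fields removed."""
    return {k: v for k, v in record.items() if k not in SAFE_HARBOR_FIELDS}

patient = {"name": "Jane Doe", "ssn": "123-45-6789", "diagnosis": "J45.909"}
print(deidentify(patient))  # {'diagnosis': 'J45.909'}
```

The expert determination method, by contrast, requires a qualified statistician to certify that the re-identification risk is very small, so it cannot be reduced to a simple field-stripping rule like this.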

Third, the amendment makes clear that information that is “reidentified shall no longer be eligible for the exemption” except under the following circumstances:

  • Treatment, payment, or healthcare operations conducted by a covered entity or business associate acting in accordance with HIPAA;
  • Research, as defined in Section 164.501 of HIPAA, that is consistent with the Common Rule;
  • Public health activities as described in Section 164.512 of HIPAA;
  • Pursuant to contract; or
  • If otherwise required by law.

Finally, the amendment provides that beginning January 1, 2021, any contract for the sale or license of deidentified information must include (1) a statement that the information being sold or licensed includes deidentified information; (2) a statement that reidentification is prohibited; and (3) a statement that the purchaser or licensee may not further disclose the deidentified information to a third party unless the third party is contractually bound by the same or stricter restrictions and conditions.

In a time of unprecedented changes, expect to see additional developments in state privacy laws —especially privacy laws that concern healthcare. Stay informed as we continue to monitor those developments.

Threats, Harassment, and Contact Tracing: Why Privacy Programs are Expanding to Protect Health Care Workers

Back in March we wrote about Address Confidentiality Programs (ACPs) as the “high stakes compliance risk you probably haven’t heard of.” These state-sponsored programs were traditionally designed to protect victims of crimes such as domestic abuse, sexual assault, stalking, or human trafficking from perpetrators who seek to find and harm their victims. Since that first post a lot has changed, namely COVID-19, the nation’s divisive stance on masks, and attitudes toward public health officials.

Nationwide, at least 61 state or local health leaders in 27 states have resigned, retired, or been fired since April 2020, according to U.S. News. The same report noted that 13 of those departures were in California, including 11 county health officials and California’s top two public health officials. A contributing factor to this exodus of leaders is that health officials are facing an increasing amount of harassment, and even threats of violence, for their work on issues such as contact tracing and COVID-19 containment strategies.

It is in the midst of this volatile healthcare environment that, on Wednesday, September 23, 2020, California expanded its ACP to include not just victims of domestic abuse, but also local health officers and other public health officials. This new expansion of the law allows these frontline workers to hide their addresses from the public in order to protect them from possible threats or violence.

Gov. Gavin Newsom’s Executive Order noted that “local health officers and other public health officials protecting public health during the COVID-19 pandemic have been subject to threats and other harassment including threats and harassment targeted at their places of residence, which threatens to chill the performance of their critical duties.” The Executive Order directs California Secretary of State Alex Padilla to establish a procedure to allow public health officials to participate in the California Safe at Home Confidential Address Program.

This shift is a significant milestone in expanding privacy rights of vulnerable populations, particularly when that vulnerability is a product of evolving public sentiment toward a particular group of people. ACPs shield home, work, and school addresses from public record searches and FOIA. Nationwide, about 41 states have some form of ACP. Most states restrict disclosure from public records, but seven states actually prohibit private companies from disclosing location information as well.

ACPs operate by providing alternate addresses for participants to use in place of their actual address. These designated addresses divert participants’ mail to a confidential third-party location (often a P.O. Box and/or a “lot number”), after which a state agency forwards the mail to the participant’s actual address.

As healthcare workers continue to battle on the front lines of the pandemic, it will be interesting to see if other states follow California’s lead. In the meantime, it is important to pay close attention to often-missed state privacy laws, such as ACPs. As the privacy landscape continues to evolve, these laws will continue to create a patchwork of regulations in the absence of sweeping federal privacy legislation. For now, California healthcare workers are on the list of individuals whose privacy – and safety – often remains dependent on policy implementation at the state level.

Continue to look for further updates and alerts from Bradley on state privacy rights and obligations.

Schrems II, Part 2 — Additional Guidance for the Transfer of Personal Data Between the EU and U.S.

Additional Impacts of the Invalidation of the EU-U.S. Privacy Shield

As previously advised, on July 16, 2020, the Court of Justice of the European Union (CJEU) issued a lengthy and detailed opinion invalidating the EU-U.S. Privacy Shield. The decision required immediate changes in the transfer of “personal data” between the European Union (EU) and the United States.

EU – U.S. Personal Data Protection

The General Data Protection Regulation (GDPR) was approved by the EU in 2016 and dramatically enhanced protections for EU personal data, including:

  • Clear, plain language for individual consents.
  • The right to the details of the use and processing of personal data.
  • The right to receive a copy of all personal data in a “commonly used and machine-readable format” – and even to have it provided to competing parties.
  • The “right to be forgotten” through erasure of personal data or termination of search links.
  • Notification of a breach within 72 hours.

The GDPR limits transfers of personal data of EU citizens outside the EU to only those countries that have the same level of data protection as the EU. Until the Schrems I and II decisions, businesses could transfer EU personal data into the U.S. under government-defined data protection regimes called the EU-U.S. Safe Harbor, and later the Privacy Shield.

EU Challenges to U.S. Privacy Protections

The U.S. Safe Harbor was initially challenged and invalidated by the CJEU in a case against Facebook, commonly referred to as “Schrems I.” Schrems brought a second action challenging the suitability of the EU-U.S. Privacy Shield, which was created to address the Safe Harbor issues. The CJEU’s July 16 “Schrems II” opinion invalidated the Privacy Shield but left open the use of GDPR “standard contractual clauses.”

Schrems II generally follows Schrems I in finding that there are insufficient protections against U.S. intelligence and/or law enforcement agencies obtaining personal data of EU citizens. The most significant difference is that Schrems II recognized privacy as a fundamental right of EU citizens – tantamount to an individual liberty protected by the U.S. Bill of Rights. It is this aspect of the Schrems II decision that is now generating additional guidance by EU data protection authorities (DPAs) and enforcers, which further impacts how businesses can transfer personal data of EU data subjects going forward.

Various U.S. and EU officials initially made announcements that contractual GDPR privacy protection clauses – called “standard contractual clauses” – could still be used for the transfer of personal data between the EU and the U.S. Unfortunately, EU DPAs and EU enforcement officials are now issuing guidance advising that changes will be required in standard contractual clauses to protect the fundamental privacy right of EU citizens delineated in Schrems II from the perceived privacy threat from U.S. intelligence and law enforcement agencies.

Standard Contractual Clauses Guidance

Many U.S. businesses have utilized standard contractual clauses for the transfer of personal data from the EU. While the Schrems II opinion did not expressly invalidate the use of standard contractual clauses, it did establish that EU supervisory authorities are obliged to assess the compliance of such clauses within non-EU countries.

Immediately following Schrems II, the Data Protection Commission in Ireland and Federal Commissioner for Data Protection in Hamburg, Germany issued pronouncements questioning the adequacy of standard contractual clauses for transfers of existing EU personal data to the U.S.

On August 24, the DPA for Baden-Württemberg, Germany, issued additional guidance on protections needed in standard contractual clauses for transfers of EU personal data to the U.S. More specifically, the German DPA recommended that standard contractual clauses for transfers from the EU to the U.S. include 1) the use of encryption where “only the data exporter has the key and which cannot be broken by US intelligence services;” and 2) anonymization of personal data that can only be correlated back to the data subject by the data exporter. The German DPA even provided a compliance checklist of recommendations, which mirrors recommendations that Bradley has previously provided to minimize cybersecurity risks:

  • Maintain a detailed schedule of data, data transfers and data locations – i.e., use of data maps.
  • Communicate the impact and effects of Schrems II with all service providers – i.e., proactively communicate compliance and contractual requirements to service providers to assure regulatory compliance and delineate contractual responsibility.
  • Research applicable local and federal laws in other jurisdictions – i.e., harmonize regulatory compliance and contractual responsibility where possible.
  • Determine whether a non-EU country has been found inadequate by the EU – i.e., keep current on the ever-changing cyber and privacy requirements of applicable jurisdictions.
  • Determine whether standard contractual clauses must be modified due to inadequate protections of a non-EU country – i.e., assure regulatory compliance and contractual responsibility with multi-jurisdictional requirements.
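The German DPA’s second recommendation, pseudonymization that only the data exporter can reverse, can be sketched with keyed hashing, where the exporter alone holds the secret key and the importer sees only opaque tokens. This is an illustrative sketch of the technique, not a complete compliance measure.

```python
import hashlib
import hmac
import secrets

# Sketch of HMAC-based pseudonymization (illustrative only): the EU data
# exporter keeps the secret key, so only the exporter can correlate a
# pseudonym back to a data subject; the U.S. importer sees opaque tokens.
def pseudonymize(identifier, exporter_key):
    """Derive a stable, non-reversible pseudonym from an identifier
    using a key held exclusively by the data exporter."""
    return hmac.new(exporter_key, identifier.encode(), hashlib.sha256).hexdigest()

exporter_key = secrets.token_bytes(32)  # held only by the EU data exporter
token = pseudonymize("jane.doe@example.eu", exporter_key)
# Same input and same key always yield the same token, so the exporter
# can maintain a lookup table while the importer cannot reverse it.
assert token == pseudonymize("jane.doe@example.eu", exporter_key)
```

Because the mapping is only recoverable with the exporter’s key, the transferred dataset alone does not reveal the data subjects, which is the property the guidance is after.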

The Belgian DPA issued similar guidance on August 31, and other EU DPAs are likely to issue additional guidance in the coming months. We will continue to monitor for such announcements and provide updates accordingly.

In addition to the actions and guidance from EU regulators, there is already an effort to address the issue from a U.S. federal regulatory perspective. On September 3, the EU Justice Commissioner, speaking before the EU Parliament’s Committee on Civil Liberties, Justice and Home Affairs, advised that the EU is working with the U.S. to develop solutions for required protections – though from a U.S. perspective no action is likely until after the U.S. election in November.

Immediate Recommendations

  • Suspend personal data transfers from the EU to the U.S. that are based solely on the EU-U.S. Privacy Shield.
  • Suspend personal data transfers involving Germany that are based on pre-existing standard contractual clauses until the clauses can be revised to reflect the guidance of the Baden-Württemberg DPA.
  • Continue to amend standard contractual clauses as required to comply with additional guidance from other EU DPAs.

Continue to look for further updates and alerts from Bradley to remain compliant with the collection, use, storage and transfer of personal data from the EU.

David Vance Lucas is a member of Bradley’s Intellectual Property and Cybersecurity and Privacy practice groups and leads the International and Cross Border team. Much of David’s experience was accumulated as general counsel for a multinational technology company. He now advises both U.S. and foreign clients on the harmonized application of U.S., U.K. and European laws, and represents clients in various legal proceedings in U.S. and foreign venues.

Gaining Momentum Outside California: Former Presidential Candidate Named Chair of CPRA Advocacy Group Advisory Board

When California voters head to the polls on November 3, 2020, they will decide whether to approve Proposition 24 — the California Privacy Rights Act (CPRA). If approved, the act would establish stronger privacy rights than those under the recently enacted landmark California Consumer Privacy Act (CCPA).

Last week, the effort to approve the CPRA gained a prominent supporter as former Democratic Party presidential candidate Andrew Yang was named chair of the advisory board for Californians for Consumer Privacy, an organization that created and led the effort to enact the CCPA and is now advocating for the approval of the CPRA. Yang, who is an entrepreneur and was originally a corporate lawyer, began working in startups and early-stage growth companies as a founder or executive from 2000 to 2009. In 2011, he founded a nonprofit organization focused on creating jobs in cities struggling to recover from the Great Recession. He then ran as a candidate in the 2020 Democratic presidential primaries.

The CPRA would amend the CCPA and create new privacy rights and obligations in California, including:

  • The CPRA would establish a new category of “sensitive personal information,” which would be defined, among other things, to include a Social Security number, driver’s license number, geolocation data, and passport number.
  • The CPRA would grant California consumers the right to request the correction of their personal information held by a business if that information is inaccurate.
  • The CPRA would establish the California Privacy Protection Agency to enforce the law. The CCPA, by contrast, is enforceable by the California Attorney General’s Office.

Continue to look for further updates and alerts from Bradley on California privacy rights and obligations.