A Year Later: CCPA Compliance Reminder to Review Your Privacy Policy

Has it been a year already? Many businesses diligently did their best to hit the moving CCPA target as they welcomed 2020 and the effective date of the statute last year. A year ago, all we had were draft regulations and a statute, and businesses had to do their best to comply. So, it is not surprising that many CCPA disclosures had an effective date of January 1, 2020. But if that is the last time your disclosures were reviewed, your business is signaling non-compliance with CCPA.

Section 1798.130 sets out the various lists required in a CCPA online privacy policy, including the categories of personal information collected and shared “in the preceding 12 months” (see, e.g., Section 1798.130(a)(5)(B): “a list of the categories of personal information it has collected about consumers in the preceding 12 months…”). Given this time frame for the disclosure, the statute includes the affirmative requirement that a business must make these disclosures and “update that information at least once every 12 months…” (see Section 1798.130(a)(5)).

If you have not reviewed your CCPA disclosures since January 1, 2020, your business likely needs an additional review of its privacy policy based on the new regulations promulgated in the last 12 months. On January 1, 2020, businesses had to choose between trying to comply with the original draft regulations released on October 10, 2019, or the high-level statutory requirements. In either case, the final regulations and their subsequent modifications have added requirements or, in some cases, removed requirements set out in the original proposed regulations. Therefore, when conducting the required substantive review, it is recommended that the latest version of the regulations, released on August 14, 2020 (the “final regulations”), be consulted. Note that some slight modifications to even the final regulations have been proposed but not yet approved. Here are a few notable differences between the original draft regulations and the final regulations:

  • Section 999.305(a)(3) of the original draft regulations required a business to get “explicit consent” if it intended to use the personal information for a materially different purpose than what was explicitly disclosed at collection. This language contemplating “explicit consent” has been removed entirely in the final regulations.
  • The original draft regulations gave businesses the option to use the link title “Do Not Sell My Personal Information” or “Do Not Sell My Info.” While many chose the latter, this option was removed in subsequent revisions and only “Do Not Sell My Personal Information” remains as the specified link title (see, e.g., Section 999.305(b)(3)).
  • The original draft regulations did not exempt employees from the various disclosures. The final regulations clarified that employee disclosures are limited to a notice at collection, which does not need to include a “Do Not Sell” link or a link to the privacy policy.
  • The original draft regulations had a process whereby a business that did not directly collect information from a consumer could only sell it by obtaining a signed attestation that the source provided a “Do Not Sell” link. This option was removed entirely and replaced with an implied requirement that a business that does not directly collect information must still provide a notice at collection if they sell the consumer’s personal information (see 999.305(d)).
  • The original draft regulations required an “interactive webform” as a method of submitting verified consumer requests but this affirmative requirement has been removed (see final regulations Section 999.312(a) that requires a toll-free number and one other method).

Although CCPA makes this type of annual review a regulatory requirement, privacy is an evolving area of law, and privacy reviews should be built into the organizational compliance process of any business. Staying on top of privacy laws, regulations, guidance, and requirements is becoming even more important as we move into 2021, which is likely to be another landmark year for privacy. Stay tuned in to these developments through Bradley’s Online and OnPoint blog by subscribing here.

Privacy Litigation Updates for the Financial Services Sector: Yodlee and Envestnet Sued for Data Disclosure and Processing Practices

Consumers are more aware than ever of data privacy and security issues. As technology develops, vast quantities of data are collected on individuals every minute of every day. Customers trust their institutions to keep the troves of financial data on them private and secure.

Wesch v. Yodlee, Inc. and Envestnet, Inc.

A recent class action lawsuit filed against Yodlee and its parent company, Envestnet, puts a spotlight on data-sharing practices involving consumer financial information. The plaintiff, Deborah Wesch, alleges in the complaint that Yodlee’s business practice of collecting, extracting, and selling personal data violates several privacy laws, including the California Consumer Privacy Act (CCPA), California’s Financial Information Privacy Act (CalFIPA), the California Online Privacy Protection Act (CalOPPA), and the Gramm-Leach-Bliley Act (GLBA) Privacy Rule. The complaint brings to light an issue that many financial institutions have grappled, and continue to grapple, with under a dual state and federal privacy regime — namely, whether Yodlee’s data aggregation practices are subject to CCPA or whether Yodlee qualifies as a “financial institution” under GLBA and CalFIPA. To further complicate the issue, even if Yodlee does qualify as a financial institution, the information would need to be collected pursuant to GLBA and CalFIPA in order to fit within the narrow exception provided under CCPA.

Based on the complaint, Yodlee is a data aggregation and data analytics company. It obtains access to individuals’ financial data through its Application Programming Interface (API) software, which is used to connect financial apps and software to third-party service providers. This software is integrated into the platforms of some of the largest financial institutions in the country.

The plaintiff alleges that the software is “silently integrate[d] into its clients’ existing platforms to provide various financial services.” This silent integration is significant because “the customer believes that it is interacting with its home institution (e.g., its bank) and has no idea it is logging into or using a Yodlee product.” But when the customer enters their bank login information, Yodlee “stores a copy of each individual’s bank login information (i.e., her username and password) on its own system after the connection is made between that individual’s bank account and any other third-party service (e.g., PayPal).”

Once Yodlee has access to the individual’s account, the plaintiff alleges that Yodlee routinely extracts data from the user’s account, even after the individual has severed the connection between their bank account and the third-party service. After access is revoked, Yodlee allegedly continues to access the account by relying on its own stored copy of the individual’s credentials. Then, Yodlee allegedly aggregates the data, including bank account balances and transaction histories, and sells it to third parties for a fee.

The plaintiff alleges that when she connected her bank account to PayPal through Yodlee’s account verification API, she did not receive the appropriate disclosures about Yodlee’s business practices of storing her account log-in information. Yodlee then continuously accessed and extracted information from her account and sold her personal data to third parties without her knowledge or consent. Even though “PayPal discloses to individuals that Yodlee is involved in connecting their bank account to PayPal’s service for the limited purpose of confirming the individual’s bank details, checking their balance, and transactions, as needed,” the plaintiff alleges that “Yodlee’s involvement with the individual’s data goes well beyond the limited consent provided to facilitate a connection between their bank account and PayPal.”

Banks and other businesses that deal with financial information have unique privacy considerations that should be evaluated in light of this pending case. First and foremost, businesses should re-evaluate their data-sharing practices with third parties that are known data aggregators by reviewing their contracts with such third-party service providers. Businesses often rely on specific legal bases to share information with third parties and should review those bases to ensure compliance with applicable privacy laws. More specifically, businesses that qualify as “financial institutions” under GLBA and/or CalFIPA should evaluate what legal basis they are relying on when sharing customer information with non-affiliate third parties.

  • If the service provider exception is the legal basis relied upon to share data, businesses should confirm their contracts properly impose the limitations required by applicable privacy laws, such as those required by GLBA or CalFIPA.
  • Moreover, if the consent or authorization exception is the legal basis relied upon to share data, businesses should ensure that they have received consent as defined by, and in accordance with, applicable privacy laws.
  • However, if the business has taken the position that the information being shared does not qualify as “nonpublic personal information” or “personally identifiable financial information,” the business should review the relevant definitions under applicable privacy laws to ensure that such information does not fall within the scope of nonpublic personal information.
  • Alternatively, if the business relies on the information being extracted by the third party as being de-identified data, the business should take steps to ensure that the data truly meets the applicable standard for de-identification under all applicable privacy laws.

Even if the sharing is permitted under applicable privacy laws, businesses should consider potential claims brought under California’s Unfair Competition Law when designing the interaction of their application with a third-party processor’s application. In particular, when a consumer enters their credentials to link an app or verify a bank account, a screen displaying the bank’s logo may cause consumers to believe they are entering their information on the bank’s secure portal rather than providing their credentials to a third party. Banks should make it clear to consumers that they are interacting with an outside third party.

Lastly, aside from the legal implications of sharing customer information, businesses should also consider the reputational risk of certain data-sharing practices. Stay tuned for a follow-up blog discussing the case decision and its impact on the privacy and financial services sectors.

Hanna Andersson and Salesforce Receive Preliminary Approval for Settlement of CCPA-Based Class Action Litigation

In 2019, Hanna Andersson, a children’s apparel store, suffered a data breach while using a Salesforce e-commerce platform. As a result of the breach, customers filed a class action lawsuit, alleging customer data was stolen and asking that both Hanna Andersson and Salesforce be held liable under the California Consumer Privacy Act (CCPA).

Barnes v. Hanna Andersson and Salesforce (4:20-cv-00812-DMR) was one of the first cases filed under the newly effective CCPA, and it has garnered much attention from privacy experts and attorneys alike. According to the complaint, the data breach allegedly occurred from September 16, 2019, to November 11, 2019, during which time hackers collected sensitive consumer information, such as customer names, billing and shipping addresses, payment card numbers, CVV codes, and credit card expiration dates. On December 5, 2019, law enforcement found this information on the dark web and alerted Hanna Andersson, which then investigated the incident and confirmed that Salesforce’s platform was “infected with malware.” Hanna Andersson reported this breach to customers and the California attorney general on January 15, 2020.

Plaintiffs allege that the breach was caused by Hanna Andersson’s and Salesforce’s “negligent and/or careless acts and omissions and failure to protect customer’s data … [and failure] to detect the breach.” Plaintiffs further allege that, as a result of the breach, Hanna Andersson’s customers “face a lifetime risk of identity theft.”

Moreover, Bernadette Barnes, the named plaintiff in the case, alleges that she now experiences anxiety as a result of time spent reviewing the “account compromised by the breach, contacting her credit card company, exploring credit monitoring options, and self-monitoring her accounts.” Barnes also claims to now feel hesitation about shopping on other online websites.

In December 2020, the court preliminarily approved the class action settlement filed by the plaintiffs. The settlement includes both monetary and non-monetary requirements. First, it establishes a $400,000 settlement fund that will provide cash payments of up to $500 per class member, with expense awards of up to $5,000 available to class members with extraordinary circumstances, such as rampant identity theft. The actual payment to the average class member is not ascertainable now, since it will vary depending on the ultimate size of the class; however, it is expected to be approximately $38 per class member. Second, the settlement requires Hanna Andersson to improve its cybersecurity through measures including, but not limited to, hiring a director of cybersecurity, implementing multi-factor authentication for cloud services accounts, and conducting a risk assessment consistent with the NIST Risk Management Framework.

At this point, it is unclear whether these requirements should be viewed as setting an industry standard for compliance or merely minimum practices. For example, multi-factor authentication has long been considered an industry standard, and an order for its implementation seems more like an indictment of Hanna Andersson’s practices than the creation of a new and more robust standard. Additionally, it is noteworthy that the court ordered Hanna Andersson to hire a director of cybersecurity and not a chief information security officer (CISO). While at first glance this seems like a simple difference in title, a CISO is an executive-level position that typically plays an enterprise-wide role in developing and implementing privacy and cybersecurity policies, while also responding to any incidents that may occur. Many consider a CISO role that reports directly to the CEO to be an industry best practice. In comparison, a director of cybersecurity is not an executive-level position; rather, it is a role that may report to a CISO or be siloed within an information technology department. Typically, a director of cybersecurity has less authority and power to shape policies enterprise-wide.

Regardless, this settlement will set the stage for any upcoming CCPA-related privacy and cybersecurity disputes. Furthermore, this settlement will provide insight into who may be sued under CCPA, specifically whether third-party processors may be brought into litigation going forward. In light of this decision, businesses should compare their privacy and cybersecurity practices to the settlement requirements, while bearing in mind that these represent the minimum for compliance, not necessarily the industry standard.

Continue to look for further updates and alerts from Bradley on state privacy rights and obligations.

FTC Eyes Vendor Oversight in Safeguards Rule Settlement

On December 15, 2020, the FTC announced a proposed settlement with Ascension Data & Analytics, LLC, a mortgage industry analytics company, related to alleged violations of the Gramm-Leach-Bliley Act’s (GLBA) Safeguards Rule. In particular, the FTC claimed that Ascension Data & Analytics’ vendor, OpticsML, left “tens of thousands of consumers[’]” sensitive personal information exposed “to anyone on the internet for a year” due to an error configuring the server and the storage location. The FTC contended that Ascension Data & Analytics failed to properly oversee OpticsML. The FTC voted 3-1-1 to issue the administrative complaint and to accept the consent agreement, with the full consent agreement package to be published soon in the Federal Register for the 30-day comment period.

As detailed in the FTC’s complaint, Ascension Data & Analytics contracted with OpticsML to perform optical character recognition (OCR) on mortgage documents. OpticsML stored the information on a cloud-based server and in a separate cloud-based storage location. Due to a configuration issue, the database was publicly exposed, meaning anyone could access the personal information without credentials. Although Ascension Data & Analytics’ “Third Party Vendor Risk Management” policy, in place since September 2016, required vetting of its vendors’ security measures, the FTC claimed that Ascension Data & Analytics never vetted OpticsML. Additionally, the FTC asserted that since at least September 2016, Ascension Data & Analytics never required its service providers to implement privacy measures to protect personal information.

The proposed settlement contains multiple action items for Ascension Data & Analytics to complete. Ascension Data & Analytics must establish and implement a data security program, engage an independent third-party professional to assess the procedures on an initial and biennial basis, and certify annually to the FTC its compliance with the settlement. As part of the mandated data security program, Ascension must not only conduct initial due diligence on any vendor with access to consumer data, but it must also conduct an annual written assessment of each vendor “commensurate with the risk it poses to the security of” personal information.

There are three big takeaways from the complaint and settlement.

  • First, the FTC is ramping up enforcement of the Safeguards Rule. This is not much of a surprise given the FTC’s focus on the Safeguards Rule, as evidenced by the virtual workshop it hosted this summer to discuss the proposed changes to the rule.
  • Second, the FTC appears to see vendor oversight as a key component of implementation of the Safeguards Rule. While other agencies, such as the CFPB, have indicated a specific interest in vendor oversight, this is now on the FTC’s radar.
  • Finally, this settlement underscores that regulated entities need to actively operationalize written policies and procedures, particularly around third-party risk. Financial institutions should ensure that they are in compliance with the Safeguards Rule generally and also engage in initial due diligence and continuous oversight of their vendors in order to avoid enforcement based on their vendors’ conduct.

Continue to look for further updates and alerts from Bradley relating to privacy rights and obligations.

Massachusetts Voters Approve Measure for Expanded Access to Vehicle Data

In a roller coaster of an election week, it was easy for smaller ballot measures to become overshadowed. One ballot measure that you may have missed is Massachusetts’s Ballot Question 1 regarding the “right to repair” motor vehicles. The ballot measure expands access to a driver’s motor vehicle data. Vehicles are becoming increasingly computerized due to the use of telematics systems, which collect and wirelessly transmit mechanical data to a remote server. The mechanical data captured can include location, speed, braking, fuel consumption, vehicle errors, and more. Ballot Question 1 is a measure to expand access to such mechanical data.

Beginning with model year 2022, manufacturers of motor vehicles sold in Massachusetts that use telematics systems must equip the vehicles with a standardized open access data platform. This will allow owners and independent vehicle repair facilities to access the data. Owners will be able to easily view mechanical data on their cellphones. Ballot Question 1 will also give owners more options for car repair services. Local vehicle repair shops will now have access to data that is usually restricted to dealers.

Opponents of Ballot Question 1 are concerned about the widened net of access to a driver’s data, especially location data. There are also concerns about the increased risk for security breaches. Privacy advocates have long voiced concerns about large-scale collection of location data. In fact, the New York Times Privacy Project has published a series of articles about the dangers of unregulated collection of location data.

With the influx of cars with telematics systems hitting the market, ballot measures surrounding access to vehicle data will likely increase in the next few years. However, with countervailing privacy pushes to regulate or limit the use of location data, we may also see a tug of war between laws that seek to provide access to location data and those that seek to regulate the collection and use of that information.

Continue to look for further updates and alerts from Bradley on state privacy rights and obligations.

No Unreasonable Searches or Seizures of Electronic Data in Michigan

The most intimate information can be found in the data on our cellphones and laptops, from geolocation data to search history. The level of privacy protection afforded to electronic data and communications has been unclear and ambiguous for years, but after this election, Michigan now has some clarity.

On November 3, 2020, Proposal 2 passed by an overwhelming majority, amending Article 1, Section 11 of Michigan’s Constitution. Proposal 2 will provide personal protections by prohibiting unreasonable searches or seizures of electronic data and communications. It will do so by requiring law enforcement to obtain a search warrant to access an individual’s electronic data and communications under the same conditions needed to search an individual’s home or seize an individual’s papers or physical items.

The amendment will fill a gap in the Michigan Constitution. Before the passing of Proposal 2, the Michigan Constitution provided protections against unreasonable searches and seizures for physical items but did not explicitly mention electronic data or communications. With the amendment to the Michigan Constitution, there will be no more ambiguity over whether data — such as cell phone data, communications with Alexa and Google Home, or Apple Watch health metrics — is private and requires a warrant to be searched or seized by law enforcement in Michigan.

Michigan is not the first state (and likely won’t be the last) to provide some level of protection for electronic data and communications. Utah has enacted a law banning warrantless searches and seizures of electronic data and communications. Moreover, the Florida Constitution mentions “the right of the people to be secure … against the unreasonable interception of private communications.” However, neither Utah’s nor Florida’s protections are as strong as Michigan’s explicit protections in a constitutional amendment. Going forward, it is likely that more states will follow Michigan’s lead in protecting electronic data and communications through constitutional amendments.

Continue to look for further updates and alerts from Bradley on state privacy rights and obligations.

New “Basic Assessment” Is a Bridge to CMMC for Defense Contractors

The Department of Defense (DoD) continues to enhance cybersecurity requirements in its supply chain. A new rule requires some contractors to assign a numerical score to their current cybersecurity practices. Additionally, the rule begins rolling out requirements for all defense contractors to have their cybersecurity certified by a third party.

For years, the gold standard for defense contractors has been NIST SP 800-171 (the NIST Standard). The NIST Standard establishes cybersecurity practices for companies that handle DoD “controlled unclassified information” (CUI). Historically, the NIST Standard was largely aspirational, and contractors have been allowed to self-certify that they either comply or have a plan to comply in the future. That looseness led to varying degrees of — and endlessly delayed plans for — compliance.

To address those shortcomings, the Cybersecurity Maturity Model Certification (CMMC) Framework will end the self-certification option. Instead, contractors will need certification from a CMMC Third-Party Assessment Organization (C3PAO). C3PAOs must themselves be accredited by an Accreditation Body. The CMMC Framework is being rolled out over the next five years, starting November 30, 2020. But no C3PAOs have yet been accredited, so it will be a while before contractors can be CMMC certified.

In the meantime, as a bridge to CMMC, the rule establishes a more robust assessment framework for the NIST Standard. Rather than self-certify compliance, contractors must specifically score their practices according to a detailed list of controls, on a scale from -203 to +110. This “Basic Assessment” score will be posted to the Supplier Performance Risk System (SPRS), where it can affect procurement decisions. That gives contractors additional incentive to comply with the NIST Standard sooner, rather than later. Because of substantial overlap between the NIST Standard and the CMMC controls, the scoring could smooth the eventual transition to the CMMC Framework. As Bradley has reported, this Basic Assessment is due from covered defense contractors by November 30, 2020.

NIST SP 800-171 DoD Assessment Methodology

DoD contractors are already familiar with the Defense Federal Acquisition Regulation Supplement (DFARS) cybersecurity clause 252.204-7012. It is included in nearly all DFARS-covered contracts and requires that contractors’ cybersecurity meet the NIST Standard. Historically, the DoD had no way to verify a contractor’s implementation of the standard. The new interim rule creates a more specific, standardized self-assessment methodology.

Under this new NIST SP 800-171 DoD Assessment methodology, contractors still self-assess their compliance. What’s new is the standardized, uniform methodology to be used for assessment, in which a contractor scores itself on a scale from -203 to +110, based on the controls with which it complies. In addition to the basic assessment — which is a self-assessment — after award, the government may in some cases conduct its own medium or high assessment of a contractor’s cybersecurity. Assessments generally expire after three years.
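The scoring arithmetic behind that range is straightforward: a contractor starts from a perfect score of 110 and subtracts a point weight for each unimplemented control, which is how scores can fall well below zero. A minimal sketch of that calculation follows; the control identifiers and weights shown are hypothetical illustrations, as the actual point values come from the assessment methodology's own tables.

```python
MAX_SCORE = 110  # perfect score: all 110 NIST SP 800-171 controls implemented


def basic_assessment_score(unimplemented_weights):
    """Subtract the weight of each unimplemented control from the maximum.

    unimplemented_weights maps a control identifier to its point weight.
    """
    return MAX_SCORE - sum(unimplemented_weights.values())


# Hypothetical gaps: these IDs and weights are illustrative only.
gaps = {"3.1.1": 5, "3.5.3": 5, "3.8.9": 1}
print(basic_assessment_score(gaps))  # 110 - 11 = 99
```

Because each control carries a fixed weight, the same arithmetic also lets a contractor prioritize remediation by how many points each gap would recover.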

The basic assessment applies across the board to all defense contractors who handle DoD CUI.

CMMC Framework

The industry has been preparing for CMMC since last year, so its entry into the DFARS comes as no surprise. As discussed above, no C3PAOs have yet been accredited. Nonetheless, efforts to comply with the CMMC standards will not be wasted.

CMMC has five levels of compliance, and the required compliance level will be defined in each contract based on the associated risks. Every contractor will need to meet CMMC Level 1. And any contractor handling CUI will probably need to meet at least CMMC Level 3. The Level 3 requirements remain very similar to the NIST Standard. For many contractors, then, their CMMC efforts will help improve their Basic Assessment score.

Risks of Noncompliance

Though the Basic Assessment is a self-assessment, defense contractors should score themselves honestly. In the Eastern District of California, claims are proceeding against Aerojet Rocketdyne under the False Claims Act (FCA) for its allegedly false representations of cyber compliance. In allowing the suit to move forward, the court recognized the possibility that “the government never expected full compliance.” However, the specific “extent to which a company was technically compl[ia]nt still mattered.” Earlier, the Ninth Circuit had allowed similar FCA claims to proceed against Raytheon in another closely watched case. Those claims have now been finally dismissed. But both the Raytheon and Aerojet Rocketdyne cases gesture toward the importance of honest and accurate assessments of cyber compliance.

The DoD recognizes the global importance of strong cybersecurity. This new rule shows that the government is tightening not only the security requirements but also the assessment framework and reporting requirements, which are equally important to ensuring robust controls and compliance. But many defense contractors — especially small businesses — may chafe under some of the new requirements. Assessing, conforming, and certifying systems and procedures could cost even a relatively small contractor many tens of thousands of dollars. Bradley will continue to report on new developments.

If you have any questions about the topics discussed in this article or any related issues, please feel free to contact Aron Beezley, David Lucas, or Andrew Tuggle.

2020 Brings Times of Change: Key Privacy Law Updates This Year

The privacy law landscape is constantly changing, and it can feel like a daunting task for businesses to keep up with the laws of 50 states in the U.S. plus any international laws that also may be applicable. 2020 seems to be a banner year for change on many fronts. COVID-19 and the 2020 elections have caused profound changes this year, but for those who are affected by changing privacy laws, this has been a remarkable year of change as well.

For example, the California Consumer Privacy Act (CCPA) went into effect on January 1, 2020; the final regulations under CCPA were approved by the California Office of Administrative Law in August 2020; and shortly thereafter, the November elections brought additional change with the passage of the California Privacy Rights Act (CPRA). CPRA does not go into effect until January 1, 2023; however, it has a one-year lookback, which means that companies will need to be largely in compliance by January 1, 2022. Additionally, anyone who has implemented CCPA or GDPR will attest to how quickly two years can fly by when attempting to understand the multitude of changes imposed by a comprehensive privacy law like CPRA. The accumulation of new requirements and the broad scope of CPRA will need to be understood and implemented into privacy policies and procedures going forward in an effort to ensure compliance on January 1, 2023.

However, the CCPA/CPRA changes are only one example of the consumer data privacy legislation changes this year. According to the National Conference of State Legislatures, in 2020, bills relating to consumer data privacy legislation were considered in at least 30 states and in Puerto Rico (see NCSL 2020 Consumer Data Privacy Legislation). Though most of these bills were not passed, the fact that these bills were considered is an indicator of the interest in protection of consumer data and seems to foreshadow an increase in privacy regulation in the future.

From an international perspective, 2020 also brought the invalidation of the EU-U.S. Privacy Shield framework by Schrems II, which caused many businesses to rethink their approach to transfers of personal data between the European Union or United Kingdom and the U.S. (see Schrems II, Part 2 – Additional Guidance for the Transfer of Personal Data Between the EU and U.S.). Schrems II did not invalidate the use of Standard Contractual Clauses (SCCs) for data transfers, but it did call into question whether the SCCs are adequate to address the risks associated with data transfers to a non-EU country. The data exporter may need to apply supplementary measures, in addition to the SCCs, to protect the personal data when transferred. Supplementary measures can include encryption, anonymization, and pseudonymization, as well as other tools. Schrems II requires that businesses analyze the protections currently in place for data transfers between the EU or the UK and the U.S. to ensure compliance.

Awareness of these changes and implementing privacy policies and practices that protect your business are key during these changing times. Continue to rely on Bradley to keep you up to date on privacy rights and obligations.

Privacy at the Polls: Portland, Maine Votes to Ban Facial Recognition Technology

While the nation waits for the results of the presidential race to be tallied, local and statewide referendums on privacy issues have been decided across the country. In Portland, Maine, voters approved a ballot measure to ban the use of facial recognition technology by local police and city agencies. Portland joins other cities such as Boston, San Francisco, and (the other) Portland, Oregon, that have already banned the use of this technology.

Facial recognition is becoming an increasingly common method of identifying or verifying the identity of an individual using their facial features. Facial recognition software is often particularly bad at recognizing minorities, women, and younger people, frequently misidentifying them, which can disparately impact certain groups, oftentimes in serious ways, when this technology is used by law enforcement or government agencies.

Portland, Maine’s ballot initiative added teeth to an ordinance passed by the Portland city council in August 2020. That ordinance included the ban on facial recognition technology but did not include any enforcement measures. The recently passed ballot initiative includes measures allowing citizens to sue the city for violations, requiring the city to suppress evidence illegally obtained through the use of this technology, and making violations of the ordinance by city employees grounds for suspension or termination. The initiative additionally provides for up to $1,000 in penalties for violations, plus attorneys’ fees.

With the rise of facial recognition technology, many advocates have warned of the potential privacy and abuse implications, especially when such technology is employed by law enforcement or state agencies. Portland, Maine’s vote to approve this ban may be a signal of what’s to come in other cities across the country.

Continue to look for further updates and alerts from Bradley on state privacy rights and obligations.

Introducing… the Global Privacy Control

One of the most frequent questions we’ve gotten from companies subject to CCPA that have a “Do Not Sell” link has been “What the heck do we do about this global privacy control?” Until now, there wasn’t a clear, or even semi-helpful, answer to that question that didn’t involve a fair amount of guesswork. We now have our answer — the aptly named “global privacy control” — but what exactly does it mean?

This concept of “user-enabled global privacy controls” was introduced in the CCPA regulations and left companies scratching their heads as to what it meant. Specifically, Section 999.315(c) states:

If a business collects personal information from consumers online, the business shall treat user-enabled global privacy controls, such as a browser plug-in or privacy setting, device setting, or other mechanism, that communicate or signal the consumer’s choice to opt-out of the sale of their personal information as a valid request submitted pursuant to Civil Code section 1798.120 for that browser or device, or, if known, for the consumer.

The use of the word “shall,” coupled with the seemingly unascertainable scope of this provision, understandably got the attention of those tasked with CCPA compliance. Based on a literal reading, a business has to somehow monitor for the development of any type of mechanism that might provide an opt-out and recognize it, or risk being considered non-compliant with CCPA. One caveat, subsection (1), provided that “[a]ny privacy control developed in accordance with these regulations shall clearly communicate or signal that a consumer intends to opt-out of the sale of personal information.” So, businesses only have to monitor for every possible mechanism that “clearly” communicates or signals an intention. This was a revision added to the original draft regulations, so presumably the regulators see this as a meaningful limitation. Nevertheless, there remains no apparent limitation on a business’s obligation to proactively monitor for highly technical implementations that it may have no internal capability to address, even if it identifies such a global privacy control.

For those wrestling with this dilemma, there was a temporary measure of comfort. Specifically, in the Final Statement of Reasons for the CCPA Regulations, the OAG stated that the subsection cited above “is forward-looking and intended to encourage innovation and the development of technological solutions to facilitate and govern the submission of requests to opt-out” (see FSOR at p. 37). So we knew, at the very least, that the OAG had no signals in mind at the time and that businesses were not expected to be processing any.

Unfortunately, it would appear that the window of comfort is coming to a close. A number of organizations, including the likes of DuckDuckGo, the Electronic Frontier Foundation, Mozilla, the NY Times, and the Washington Post, are implementing the aptly named “global privacy control” (GPC) specification. This specification explicitly references this provision of the CCPA regulations, stating “[t]he GPC signal will be intended to communicate a Do Not Sell request from a global privacy control, as per CCPA-REGULATIONS §999.315.” Given the express intent and the industry players involved, it would appear that this is the first foray into the user-enabled global privacy control. Businesses that have a “Do Not Sell” link should take note and begin to determine how they can comply.
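For businesses wondering what honoring the signal looks like mechanically: under the published GPC proposal, a participating browser sends a `Sec-GPC: 1` request header (and exposes a `navigator.globalPrivacyControl` property to page scripts). As a rough, hypothetical sketch only — the function name and framework-agnostic header handling are illustrative, and nothing here is legal advice — a server could treat that header as an opt-out request along these lines:

```python
def wants_opt_out(headers):
    """Return True if the request carries a GPC opt-out signal.

    Per the GPC proposal, a participating browser sends the request
    header "Sec-GPC: 1" when the user has enabled the control. HTTP
    header names are case-insensitive, so we normalize before checking.
    The `headers` argument is any mapping of header names to values
    (e.g., from a web framework's request object).
    """
    normalized = {name.lower(): value.strip() for name, value in headers.items()}
    return normalized.get("sec-gpc") == "1"


# Illustrative use: if the signal is present, record a "Do Not Sell"
# request for that browser or device before any sale of data occurs.
if wants_opt_out({"Sec-GPC": "1"}):
    pass  # e.g., flag the session as opted out of sale
```

A real implementation would also need to tie the signal to the business's existing opt-out workflow (for the browser or device, or, if known, for the consumer, as the regulation requires).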

Even though this one has cornered the market on the name, it is highly doubtful that this will be the last user-enabled control to signal a user’s intent to opt out, so businesses need to dedicate resources to addressing this evolving issue.