A Fintech Leader’s Thoughts on the North Carolina Regulatory Sandbox Act

As part of Bradley’s continuing coverage of the North Carolina Sandbox Act, we wanted to know what community members and NC fintech aficionados thought about this proposed legislation. We posed six questions to Tariq Bokhari, an influential leader in the financial technology (fintech) industry, who serves as the executive director of the Carolina Fintech Hub (CFH). Read more of our conversation below on how this regulatory sandbox will impact North Carolina’s fintech industry.

Bradley: How will the NC Regulatory Sandbox Act affect fintech companies generally?

Bokhari: The premise behind the NC Regulatory Sandbox Act (Innovation Sandbox) is that innovators and startups in tech 1) have difficulty piloting new ideas in a fail-fast manner due to a regulatory system not designed for that, and 2) are viewed and set up as disruptive forces to incumbent stakeholders, rather than opportunities to partner with those incumbents in a win-win scenario. The Innovation Sandbox is designed to create tools that decrease both of those headwinds that are pervasive across the country, and in doing so create a competitive advantage for our region and all that reside within it. The Carolina Fintech Hub has championed this effort for several years now, and found like-minded partners like the NC Blockchain Initiative, because we strongly believe being the most entrepreneurial and nimble of the 50 states will position us as global leaders in technology and innovation.

Bradley: What products and services are applicable for this program?

Bokhari: Its scope can truly be anything that touches technology, although the initial focus will be on fintech, insurtech and blockchain. I envision this program being expanded after one or two years to include other areas, like possibly securities, thus making the program more comprehensive.

Bradley: With regulatory sandboxes already being set up for other states’ finance and insurance economies, do you see a possible playbook for North Carolina’s fintech industry?

Bokhari: There are a few key differences with NC’s Innovation Sandbox, including our unique focus on promoting the partnership between our incumbents and startups rather than disruptive friction between them.

Bradley: Can you elaborate on what makes NC’s Innovation Sandbox unique?

Bokhari: With its unique sandbox approach, North Carolina decided to start simple while allowing for natural evolution, namely by embedding formalized accountability around innovation across NC via an Innovation Commission.

This Innovation Commission is designed to be centralized (not embedded in any one state agency), cross-representative (to maximize collaboration), lightweight in its design (very simple in its mandate) and serve as a clearing house of innovation requests and ideas (sourced from the industry with the help of established non-governmental organizations (NGOs)).

The Innovation Commission is really envisioned to have only two major tasks: 1) review the requests of those who apply to participate in any of the sandbox’s available tools, and if it deems the requests to have merit, route them to the appropriate regulating agency or agencies for ultimate decisioning; and 2) review requests to create new tools that enable further innovation, and if it deems the requests to have merit, route them to the appropriate regulating agency or agencies for ultimate decisioning.

Bradley: How will this proposed legislation change economic development in North Carolina?

Bokhari: In its simplest form, this legislation will create an Innovation Commission that will give North Carolina a significant advantage over every other state in recruiting and retaining tech companies. These companies will be able to perform certain activities with reduced governmental red tape here.

Bradley: How does the Carolina Fintech Hub plan to help its fintech partners strike the balance between protecting consumers and promoting emerging fintech technologies and innovations in the field?

Bokhari: Defining the tools is the most challenging task for any state to address, so the “secret sauce” in our approach is not trying to assume what set of tools is needed up front. Instead of assuming, we use a platform that can react to the market demands for tools as they are recognized in this formal Innovation Commission structure, while still operating within the confines of the existing regulatory agency construct to avoid unneeded or complicated friction.

The proposed legislation envisions a tool that has also been incorporated in other states’ sandbox efforts to date. This tool would enable small-scale piloting of innovations without having to apply for what may otherwise be cumbersome licenses or having to build out large-scale compliance programs for certain regulatory frameworks.

In addition to the tool described above, two additional tools are in the hopper for near-term exploration once the Innovation Commission is established: 1) after successful completion of the centralized sandbox program, a startup receives a limited-scope “stamp of recognition” that can provide additional confidence to incumbent banks and institutions and their vendor risk management processes when they contemplate engaging the startup; and 2) a blockchain Innovation Sandbox use case, which still requires significant design and vetting. The goals here are aspirational: As the model matures, I envision not only freestanding startups taking advantage of the Innovation Sandbox, but also startups acquired or established as affiliates or subsidiaries of big financial institutions utilizing the sandbox.

Bradley: Do you anticipate the Regulatory Sandbox Act will slow the pace of companies integrating emerging technology, including blockchain technology, in North Carolina’s fintech space? And will the act attract companies to relocate to NC?

Bokhari: I am highly confident this legislation will multiply the pace of innovative, emerging technology across NC, as well as our ability to recruit nationally and internationally, for a simple reason: Companies will be able to operate with less friction, and capitalize on more partnerships with our incumbents in NC, more so than in any other state. I am most interested in the blockchain aspects of the regulation. I am seeing a spike in smart contract and crypto activity lately, but most places across the country don’t even know this activity is happening, let alone have a sophisticated system to champion it statewide.

Bradley is closely monitoring this legislation and will provide continuing coverage of the proposed bill in the coming weeks and months. If passed, the legislation could become effective October 1, 2021.

Another Data Privacy Law? Colorado Enacts the Colorado Privacy Act

Colorado became the third state to enact comprehensive data privacy legislation when Gov. Jared Polis signed the Colorado Privacy Act (CPA) on July 8, 2021. The CPA shares similarities with its stateside predecessors, the California Consumer Privacy Act (CCPA), the California Privacy Rights Act (CPRA), and the Virginia Consumer Data Protection Act (VCDPA), as well as the European Union’s General Data Protection Regulation (GDPR). But the CPA’s nuances must be considered as companies subject to these statutes craft holistic compliance programs.

The CPA goes into effect on July 1, 2023. But, given its complexity, the time for companies to start preparing is now. Here are some answers to questions about the scope of the new law, the consumer rights it provides, the obligations it imposes on businesses, and its enforcement methods.

Does the CPA apply to my business?

The CPA’s jurisdictional scope is most like the VCDPA’s. The CPA applies to any “controller” – defined as an entity that “determines the purposes for and means of processing personal data” – that “conducts business in Colorado” or produces or delivers products or services “intentionally targeted” to Colorado residents and either (1) controls or processes the personal data of 100,000+ “consumers” each calendar year; or (2) controls or processes the personal data of 25,000+ consumers and derives revenue or receives discounts from selling personal data.

“Personal data” is defined as “information that is linked or reasonably linkable to an identified or identifiable individual” other than “publicly available information” or “de-identified data.” The CPA defines a “consumer” as a “Colorado resident acting only in an individual or household context.”
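For teams mapping these thresholds into an intake or compliance-screening tool, the rough sketch below restates the two volume prongs as a decision rule. The function and its inputs (a consumer count and a flag for deriving revenue or discounts from selling personal data) are hypothetical illustrations, and the sketch deliberately leaves out the “conducts business in Colorado,” targeting, and exemption questions, which require a fact-specific analysis.

```python
# Rough sketch of the CPA's two volume prongs as a decision rule.
# Function name and inputs are hypothetical; the "conducts business in /
# targets Colorado" element and the statutory exemptions are not modeled.

def cpa_volume_threshold_met(consumers_per_year: int,
                             sells_or_gets_discounts_from_data: bool) -> bool:
    """True if either CPA volume prong is satisfied for a calendar year."""
    prong_one = consumers_per_year >= 100_000
    prong_two = consumers_per_year >= 25_000 and sells_or_gets_discounts_from_data
    return prong_one or prong_two

# Example: 30,000 Colorado consumers plus revenue from selling personal data
# satisfies prong two; 90,000 consumers with no sale-related revenue does not.
print(cpa_volume_threshold_met(30_000, True))    # True
print(cpa_volume_threshold_met(90_000, False))   # False
```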

The CPA provides entity-level exemptions to air carriers and national securities associations, among others. Unlike the CCPA, CPRA, and VCDPA, the CPA does not provide an entity-level exemption to non-profit organizations.

How do I handle consumer requests regarding their personal data?

The CPA provides consumers with the right to submit authenticated requests to a controller to (1) opt-out of certain processing of their personal data; (2) access their personal data and confirm if it is being processed; (3) correct inaccuracies in their personal data; (4) delete their personal data; and (5) obtain their personal data in a portable format. A controller must inform the consumer of any actions taken or not taken in response within certain timelines.

Like the VCDPA and unlike the CCPA and CPRA, the CPA provides consumers with the right to appeal a controller’s decision concerning an authenticated request. Controllers must set up internal processes for handling such appeals.

What are a consumer’s opt-out rights?

The CPA provides consumers with the right to opt-out of the processing of personal data for: (1) sale; (2) targeted advertising; or (3) profiling. The final two opt-out rights are also found in the VCDPA and CPRA, but not the CCPA.

Like the CCPA’s definition, the CPA’s definition of the “sale” of personal data is broad: “the exchange of personal data for monetary or other valuable consideration by a controller to a third party.” But the CPA’s exceptions to this definition are much broader. Under the CPA, a controller does not sell personal data by disclosing personal data (1) to an affiliate; (2) to a “processor” that processes the personal data on the controller’s behalf; (3) to a third party for “purposes of providing a product or service” that the consumer requests; (4) that a consumer “directs the controller to disclose or intentionally discloses by using the controller to interact” with a third party; or (5) that the consumer “intentionally made available … to the general public via a channel of mass media.”

When do I have to obtain opt-in consent from a consumer?

The CPA requires that a controller obtain opt-in consent before processing (1) “sensitive” data; (2) the personal data of a “known” child; or (3) personal data “for purposes that are not reasonably necessary to or compatible with” the processing purposes that the controller previously specified to the consumer. To provide the requisite “consent,” a consumer must make a “clear, affirmative act” that signifies their “freely given, specific, informed, and unambiguous agreement” to the processing.

Importantly, a consumer accepting broad terms of use or a “similar document that contains descriptions of personal data processing along with other, unrelated information” does not constitute consent. Nor does an agreement obtained through “dark patterns,” which the CPA defines as user interfaces “designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice.”

What does my privacy notice have to say?

A controller must provide consumers with a privacy notice that is “reasonably accessible, clear, and meaningful.” All privacy notices must include the following information: (1) the categories of personal data collected or processed; (2) the purposes for which personal data are processed; (3) the categories of personal data shared with third parties; (4) the categories of third parties with whom the controller shares personal data; and (5) how a consumer can submit authenticated requests and appeals regarding such requests.

If a controller sells personal data or processes such data for targeted advertising, the privacy notice must “clearly and conspicuously disclose” that fact and how consumers can opt-out. The “opt-out method” also must be provided in a separate location that is “clear, conspicuous, and readily accessible.”

Do I have to perform data protection assessments?

Similar to the VCDPA and GDPR, the CPA requires that controllers conduct and document a “data protection assessment” regarding each of their processing activities that: (1) involves personal data acquired on or after July 1, 2023; and (2) presents a “heightened risk of harm” to a consumer. Processing that presents such a heightened risk includes (1) selling personal data; (2) processing sensitive data; (3) processing personal data for targeted advertising; or (4) processing personal data for profiling that presents a “reasonably foreseeable risk” of certain consumer harms.

Among other requirements, a data protection assessment must “identify and weigh” the benefits of the processing to the “controller, the consumer, other stakeholders, and the public” against “the potential risks to the rights of the consumer,” as “mitigated by safeguards that the controller can employ to reduce the risks.”

What are my data minimization and security requirements?

A controller’s “collection of personal data must be adequate, relevant, and limited to what is reasonably necessary” in relation to the processing purposes disclosed to the consumer. As noted above, a controller cannot process personal data for another purpose without the consumer’s consent.

A controller must take “reasonable measures” to secure personal data from “unauthorized acquisition” during storage and use. These data security practices must be appropriate for the “nature” of the controller’s business and the “volume, scope, and nature of the personal data processed.”

How is the CPA enforced?

Unlike the CCPA, the CPA does not provide a private right of action. It is enforceable only by Colorado’s attorney general and district attorneys. CPA violations constitute a deceptive trade practice and are thus subject to civil penalties of up to $20,000 per violation.

Until January 1, 2025, the attorney general or district attorney must provide notice of a violation, which triggers a 60-day cure period. If the controller fails to cure the violation within this period, the attorney general or district attorney may initiate an enforcement action.

Conclusion

While the CPA’s similarities to predecessor privacy statutes will allow companies to leverage their current compliance efforts to obtain CPA compliance, the statute’s enactment nonetheless adds another layer to already onerous data privacy obligations.

Bradley’s Cybersecurity and Privacy team is here to help. Stay tuned for further updates and alerts on privacy-law developments by subscribing to the Online and OnPoint blog.

Technology Boom in NC? What You Should Know About the Proposed Regulatory Sandbox in the Tarheel State

Technology is evolving and advancing at a dizzying pace across the globe. Emerging technologies are reimagining everything from how we interact with each other to how we interact with businesses and institutions. Given the upward trajectory of technology, it seems that the “innovation” business is ripe for opportunity — an opportunity that appears poised to take off in North Carolina.

In 2021 alone, North Carolina has been the target for some very high-profile technology announcements, including Google’s plans to open a cloud engineering hub in Durham and Apple’s new campus in Research Triangle Park. These exciting developments are now coupled with recent proposed legislation that would create a “regulatory sandbox” further incentivizing technological economic development to expand North Carolina citizens’ access to products and services or unique business models not currently widely available.

A regulatory sandbox allows companies and entrepreneurs to test emerging technologies, products, services, or business models at the leading edge of (or even outside of) an established regulatory framework. Sandboxes have popped up across the country — from Arizona in 2018 to Kentucky, Nevada, Utah, Vermont, and Wyoming in 2019 to Florida and West Virginia in 2020 — as a way of spurring economic growth and breaking down the barriers to market access often faced by creative business models and startups. North Carolina is one of the most recent states to investigate the potential economic benefits of becoming an innovation hub. Although North Carolina tried and failed to implement a sandbox in 2019, the 2021 iteration seems more likely to succeed given the growing number of peer states that have since adopted, or are currently working on, comparable sandbox-creating legislation.

North Carolina’s Regulatory Sandbox Act of 2021 (the “NC Sandbox Act”) seeks to establish a more flexible regulatory environment for the financial services and insurance industries within the state. Here is what you should know about the proposed NC Sandbox Act, which is currently pending before the Committee on Commerce:

Purpose and Applicability

The NC Sandbox Act would permit an applicant to temporarily test an innovative financial product or service, making such product or service available to consumers on a limited basis without subjecting the applicant company to certain licensing or other regulatory obligations otherwise imposed under applicable state law.

The NC Sandbox Act would apply to entities regulated by the Office of Commissioner of Banks or the Department of Insurance and offering a product or service that falls within the definition of an “innovative product or service,” i.e., the entities are using a new or emerging technology, or are providing products, services, business models, or delivery mechanisms not currently widely accessible to the public.

Establishment of North Carolina Innovation Council

To govern the program, the NC Sandbox Act proposes to create an “Innovation Council,” which would be tasked with supporting innovation, investment, and job creation within North Carolina by encouraging participation in the regulatory sandbox. The 11-person council would set standards, principles, guidelines, and policy priorities for the types of innovations that the regulatory sandbox program would support. Interestingly, early analysis of the bill expressly mentions authorizing the Innovation Council to focus on blockchain initiatives (here’s a legislative analysis of SB470). The Innovation Council would also be responsible for approving admission into the regulatory sandbox program.

Innovation Waiver Applications

For $50, an innovator can apply for admission into the regulatory sandbox program. In determining whether to admit an applicant, the Innovation Council will consider:

  1. The nature of the innovative product or service proposed to be made available to consumers, including the potential risk to consumers;
  2. The methods that will be used to protect consumers and resolve complaints during the sandbox period;
  3. The entity’s business plan, including availability of capital;
  4. Whether the entity’s management has the necessary expertise to conduct a pilot of the innovative product or service during the sandbox period;
  5. Whether any person substantially involved in the development, operation, or management of the innovative product or service has been convicted of or is currently under investigation for fraud or state or federal securities violations; and
  6. Any other factor that the Innovation Council or the applicable state agency determines to be relevant.

By tasking the Innovation Council with the responsibility of considering consumer protection in addition to economic growth when evaluating applicant entities, proponents of the legislation seemingly attempt to avoid some of the criticisms that surrounded the 2019 sandbox proposal.

Applicants must also have a physical presence in North Carolina. A waiver of specified requirements imposed by statute or rule may be granted as part of entry into the program and would be valid for the duration of participation in the regulatory sandbox, but typically not to exceed 24 months.

More to Come

The proposed legislation also addresses sandbox program requirements, consumer protections, record requirements, privacy, and other initiatives and obligations in more detail. As mentioned above, the legislation is currently pending before the committee, and Bradley is closely monitoring this legislation and will provide continuing coverage of the proposed bill in the coming weeks and months. If passed, the legislation could become effective October 1, 2021.

If you have any questions, please reach out to the authors, Erin Illman or Lyndsay Medlin.

Energy and Infrastructure Companies Need to Know about the DOE’s and Other Agencies’ Focus on Cybersecurity

On March 18, 2021, the Department of Energy’s (DOE) Office of Cybersecurity, Energy Security, and Emergency Response (CESER) announced three new research programs that are “designed to safeguard and protect the U.S. energy system” from potential cyberattacks. The DOE also announced a 100-day plan to address cybersecurity risks to the U.S. electric system. Not to be left behind, the Transportation Security Administration (TSA) issued a new security directive in light of the Colonial Pipeline cyberattack. Together, these agency actions demonstrate the scale and intensity of the threat to the energy industry and the focus of the government to curb the threat to our national infrastructure systems. Energy companies should monitor these developments and assess their internal controls to ensure they are cyber-resilient.

The Colonial Pipeline cyberattack surfaced on May 7, 2021, and confronted residents of many Southern states with a real possibility of running out of gas. But, in the days leading up to the ransomware attack, the DOE and the Biden administration were already turning their attention to cyberthreats to the energy industry. The electric system was of special concern, being another piece of critical infrastructure vulnerable to attacks — extensive power interruptions could have devastating consequences. The Colonial Pipeline cyberattack vividly demonstrates that the post-9/11 sensitivity to terrorists’ physical threats must now include cyber threats.

Less than a week after the pipeline restarted, the DOE revealed its three-prong research plan. The research programs will focus on: (1) securing against vulnerabilities in globally sourced technologies; (2) developing solutions to electromagnetic and geomagnetic interference; and (3) cultivating both research on cybersecurity solutions and the new talent needed to deploy them. The emphasis on the supply chain echoes anxieties in the Executive Order on Improving the Nation’s Cybersecurity, with its goals for the security of commercial software.

Importantly, the DOE is attempting to work with the industry. It kicked off its implementation of a 100-day plan — a plan formed by the Biden administration “to enhance the cybersecurity of electric utilities’ industrial control systems (ICS) and secure the energy sector supply chain” — by soliciting input from stakeholders. Through a Request for Information (RFI), the Office of Electricity sought comments from the public on various aspects of the electric infrastructure. When the public-comment period closed on June 7, 2021, nearly 100 entities had submitted comments. The energy industry is fully as interested in these issues as is the government.

Directly responding to the Colonial Pipeline cyberattack, the Department of Homeland Security (DHS) — through the TSA — issued Security Directive Pipeline-2021-01, aimed at tightening its control of pipelines’ security. The directive requires that critical pipeline operators (1) report cyber incidents; (2) designate a Cybersecurity Coordinator; and (3) assess, remediate, and report their cybersecurity measures. Failures to correct deficiencies or to comply with the new rules could result in substantial fines under the TSA’s enabling statute.

Federal agencies and the Biden administration are giving strong, coordinated signals that — as a result of cyber threats and attacks — lax standards, minimal enforcement, and carrots for compliance are things of the past. However, the large number of agencies and divisions with enforcement powers could make compliance confusing and difficult — especially if different critical infrastructure industries are subject to different standards. As a result, infrastructure and energy companies should take action now to harden their security measures. Best practices will help mitigate not only government scrutiny, but also the threat of an attack.

Executive Order on Cybersecurity Sets Aggressive Timeline

The Colonial Pipeline cyberattack prompted the issuance of a long-awaited executive order (EO) on improving U.S. cybersecurity. The EO mandates that, within six months, all federal agencies implement multi-factor authentication (MFA) and both at-rest and in-transit encryption. It also calls for agencies to comprehensively log, share, and analyze information about cyber incidents and creates a Cyber Safety Review Board to that end. The EO sets deadlines for agencies to write guidelines for securing software and detecting threats.

Bradley has authored prior articles and alerts regarding the U.S. government’s increasing attention to cybersecurity — including at the Department of Defense, across the federal government as a whole, and even at the state level. With its focus on timelines and deadlines, this EO emphasizes the urgency of improving cybersecurity across industries.

Three goals, with a focus on timing

In a press call, the White House highlighted three goals of the EO:

  • Protect federal networks with specific tools, such as encryption, MFA, endpoint detection and response (EDR), logging, and Zero Trust Architecture.
  • Improve the security of commercial software by establishing security requirements; by using the power of the purse to prime the market for secure software; and by labelling consumer products with a cybersecurity grade.
  • Pool agencies’ information about incidents and enhance incident responses, including through a Cyber Safety Review Board (modelled on the national board that investigates plane crashes).

Reflecting the urgency of better cybersecurity, the EO sets clear, tight deadlines — more than 40 of them. The earliest deadline is set only 14 days after the EO’s release. More than 15 agencies — including the Office of Management and Budget, the Attorney General, the DoD, CISA, and NIST — are tasked with specific responsibilities to write, implement, or enforce the new measures.
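For reference, the EO was signed on May 12, 2021, so the 14-day and 180-day clocks land on the dates cited later in this post; the quick sketch below works through that arithmetic.

```python
# Quick date arithmetic for the EO's earliest deadlines.
# Assumes the May 12, 2021 signing date; the 14-day and 180-day windows
# correspond to the May 26 logging recommendations and the November 8
# MFA/encryption deadline discussed below.
from datetime import date, timedelta

issued = date(2021, 5, 12)
print(issued + timedelta(days=14))    # 2021-05-26
print(issued + timedelta(days=180))   # 2021-11-08
```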

Outline of the executive order

The Biden administration’s stated policy is that cybersecurity is a “top priority and essential to national and economic security.” To that end, the provisions of the EO apply to “all Federal Information Systems.”

The EO specifically addresses the following issues:

  • Removing barriers to sharing threat information. The White House’s fact sheet uses the phrase “sharing between government and the private sector.” This section aims to expand the requirements on the private sector to provide incident information to the government. To that end, the EO calls for revision of both the FAR and DFARS reporting requirements. Defense contractors are already familiar with the DFARS requirement to “rapidly report” cyber incidents within 72 hours. New requirements may require less rapid reporting for less sensitive incidents.

 

  • Modernizing federal government cybersecurity. This section mandates specific security requirements. Before November 8, 2021, all federal agencies must implement MFA and encryption. Additionally, the EO sets a timeline for adoption of more secure cloud services and for government-wide Zero Trust Architecture. Importantly, this section repeats that the administration’s policy of “protecting privacy and civil liberties” is in tension with modern cybersecurity.

 

  • Enhancing software supply chain security. As chartered, NIST will shoulder the burden for establishing baseline security standards for software, including defining “critical software” and secure procedures for software development. One important component will be providing a Software Bill of Materials (SBOM), which is a record of the details and supply-chain relationships of components used to build software. The SBOM is similar to a list of ingredients on food packaging. It will allow tracking of open-source and other third-party components through a supply chain so that risks can be more easily evaluated — and patched. A second important component is a “consumer labeling program” similar to Singapore’s, for grading the cybersecurity of IoT devices.

 

  • Establishing a Cyber Safety Review Board. When a plane crashes, the National Transportation Safety Board investigates and makes recommendations to improve the safety of air transportation. There is no similar body for reviewing cyber incidents. The EO mandates that the Department of Homeland Security (DHS) establish just such a board, with both government and private-sector representatives having seats at the table. A senior administration official explained that the board’s first task will be to review the SolarWinds incident.

 

  • Standardizing the federal government’s playbook. The EO calls for creation of a “playbook” for agencies to use in responding to cybersecurity vulnerabilities and incidents. Recognizing that some such guidance has been in place for many years, the EO expressly requires that the guidance “incorporate all appropriate NIST standards.”

 

  • Improving detection of vulnerabilities and incidents. Agencies are called to actively hunt for threats and vulnerabilities. Each agency must submit its plan to CISA for a Continuous Diagnostics and Mitigation Program. This program has been around since 2012. The EO seeks to enhance threat-hunting activities and deployment of other Endpoint Detection and Response (EDR) initiatives.

 

  • Improving investigative and remediation capabilities. The very earliest deadline set by the EO is May 26 for DHS to recommend requirements for logging events and retaining other relevant incident data. The EO invites the FAR Council to consider the recommendations in its revision of the FAR and the DFARS reporting requirements.

What this means for industry

Much of the EO mandates actions by government agencies. But it does create action items for private entities. Above all, government contractors should watch for impactful changes to FAR and DFARS cybersecurity clauses. These have been revised multiple times recently, and we expect the Biden administration to revise them again — especially amid ongoing delays of the CMMC rollout. Software developers should begin inventorying their products and preparing SBOMs, especially for those in use by government agencies. Manufacturers of IoT devices should also expect that their devices must soon bear a label that marks their security level. Market forces may encourage production of higher-security devices.
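As a concrete illustration of the “list of ingredients” idea, the snippet below builds a minimal SBOM-style record loosely patterned on the CycloneDX JSON layout. The application and component names, versions, and package URLs are hypothetical, and real SBOMs are normally generated by build or dependency-scanning tools rather than written by hand.

```python
# Minimal sketch of an SBOM-style "list of ingredients" for a piece of software.
# Loosely modeled on the CycloneDX JSON layout; names, versions, and purls are
# hypothetical, and real SBOMs are usually produced by build or scanning tools.
import json

sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.4",
    "metadata": {"component": {"type": "application", "name": "example-app", "version": "2.3.0"}},
    "components": [
        {"type": "library", "name": "openssl", "version": "1.1.1k",
         "purl": "pkg:generic/openssl@1.1.1k"},
        {"type": "library", "name": "log4j-core", "version": "2.17.2",
         "purl": "pkg:maven/org.apache.logging.log4j/log4j-core@2.17.2"},
    ],
}

print(json.dumps(sbom, indent=2))
```

Keeping such an inventory machine-readable is what lets a supplier or agency quickly answer questions like whether any shipped product contains a known-vulnerable component version.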

Contact Andrew Tuggle, David Vance Lucas, or Sarah Sutton Osborne with any questions about the order’s impact on your business.

Circuit Split No More: 2nd Circuit Clarifies Article III Standing in Data Breach Cases

While more states push forward on new privacy legislation statutorily granting consumers the right to litigate control of their personal information, federal courts continue to ponder how data breach injury fits traditional standing requirements. Prior to McMorris v. Carlos Lopez & Assocs., LLC, many argued there was a circuit split regarding whether an increased risk of identity theft resulting from a data breach is sufficient to establish Article III standing. However, in McMorris, the Second Circuit denied any confusion among its sister courts. Rather, the Second Circuit interestingly held that all courts have technically allowed for the possibility that an increased risk of identity theft could establish standing, but no plaintiff has yet hit the mark. Despite implying that standing could hypothetically exist in certain cases, however, the Second Circuit nonetheless found that McMorris fell short.

Devonne McMorris was an employee at a veterans’ health services clinic, Carlos Lopez & Associates, LLC (CLA). In 2018, a CLA employee mistakenly sent an email to other CLA employees containing a spreadsheet with sensitive personally identifiable information (PII), including, but not limited to, Social Security numbers, home addresses, and dates of birth of McMorris and over 100 other CLA employees. McMorris and other class-action plaintiffs filed suit claiming that this purported breach caused them to cancel credit cards, purchase credit monitoring and identity theft protection services, and assess whether to apply for new Social Security numbers. The class-action plaintiffs reached a settlement with CLA, but when the agreement was submitted for approval, the United States District Court for the Southern District of New York rejected it for lack of Article III standing. Only McMorris appealed to the Second Circuit.

The Holding

After reviewing recent decisions delivered by other circuits regarding standing and an increased risk for identity theft, the Second Circuit denied the existence of a circuit split, stating “[i]n actuality, no court of appeals has explicitly foreclosed plaintiffs from establishing standing based on a risk of future identity theft – even those courts that have declined to find standing on the facts of a particular case.”

In deciding the present case, as a case of first impression, the Second Circuit unequivocally held that an increased risk of identity theft could be enough to establish standing, but only under the right circumstances. The Second Circuit set forth a non-exhaustive list of factors to consider:

  1. Whether the plaintiff’s data has been exposed as the result of a targeted attempt to obtain that data (which would make future harm more likely);
  2. Whether any portion of the dataset has already been misused, even if the plaintiffs themselves have not yet experienced identity theft or fraud; and
  3. Whether the type of data that has been exposed is of such a sensitive nature that the risk of identity theft or fraud is heightened.

Despite the foregoing encouragement to would-be plaintiffs, the Second Circuit then struck a blow, holding that self-created damages, in the form of proactive steps to acquire protection from future harm post-data breach, such as purchasing credit monitoring, do not establish an injury in fact. Because there was no evidence of further dissemination of the PII and McMorris’ data was not exposed as a result of a targeted hacking attempt, thereby making future harm hypothetical, McMorris lacked Article III standing. Although the data was sensitive, the court stated “[t]he sensitive nature of McMorris’s internally disclosed PII, by itself, does not demonstrate that she is at substantial risk of future identity theft or fraud.”

McMorris has large implications for both companies and victims of data breaches because the Second Circuit made sweeping proclamations about the national state of the law of standing for data breach victims. Although the refusal to recognize credit monitoring as indicia of future harm may make it difficult for would-be plaintiffs to prove heightened risk and establish standing, the Second Circuit has nonetheless created a hypothetical roadmap for doing so in an area of the law that has been analogized to the Wild West. Notably, the roadmap enumerated by the court seems to encompass the “risk of harm” analysis used by several states, namely, that if data is accessed or acquired by an unauthorized party, it is still not a data breach if there is no risk of harm to the data subject. With this in mind, companies should review their policies and procedures regarding the prevention of and reaction to data breaches. With appropriate prevention and monitoring tools, the chance of a successful “targeted attempt to obtain data,” which could result in lawsuits, is decreased. Moreover, procedures, such as encryption of sensitive data, lower the likelihood that stolen data has “a high risk for identity theft or fraud.”
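On the encryption point, the sketch below shows one common pattern for encrypting a sensitive field before it is stored, using the third-party cryptography package. The field value, key handling, and storage step are simplified, hypothetical illustrations; production systems would keep keys in a dedicated key-management service rather than alongside the data.

```python
# Minimal sketch of field-level encryption of a sensitive value before storage.
# Uses the third-party "cryptography" package (pip install cryptography).
# Key handling is simplified for illustration; production systems would keep
# keys in a key-management service, not alongside the data.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # in practice, loaded from a KMS/secret store
cipher = Fernet(key)

ssn_plaintext = b"123-45-6789"       # hypothetical sensitive field
ssn_stored = cipher.encrypt(ssn_plaintext)   # what lands in the database

# If the stored value is later exfiltrated without the key, it is ciphertext,
# which reduces the likelihood the data is usable for identity theft or fraud.
assert cipher.decrypt(ssn_stored) == ssn_plaintext
```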

Contact Lissette Payne or Lyndsay Medlin with any questions or to discuss the impact of this case. For other updates and alerts regarding data breach liability, subscribe to Bradley’s privacy blog, Online and On Point.

Florida Legislature Considers Sweeping Data-Privacy Legislation Supported by Governor

Florida has joined the wave of states considering new comprehensive data privacy legislation. On February 15, 2021, Rep. Fiona McFarland introduced HB 969, modeled after the California Consumer Privacy Act (CCPA). The bill is supported by Gov. Ron DeSantis and the speaker of the Florida House. As introduced, HB 969 would apply to for-profit businesses that have annual gross revenues exceeding $25 million; annually buy, sell, or receive the personal information of at least 50,000 consumers; or derive at least 50% of their annual global revenues from selling or sharing consumers’ personal information. A Senate version of a similar bill (SB 1734) introduced by Republican Sen. Jennifer Bradley passed through its first committee earlier this week.

Both bills impose a number of requirements on covered entities relating to consumers’ personal information – for example, entities must maintain an online privacy policy and update it annually, provide notice at the point of collection, and respond to consumers’ requests for copies of their personal information or to correct or delete such information under certain circumstances. Covered entities also must provide consumers with the right to opt out of sharing personal information, and they are prohibited from discriminating against those who choose to do so. The bills also go a step further than what is required under the CCPA and include additional business obligations, such as data retention and limited use requirements.

The companion bills also provide consumers with numerous rights regarding their collected personal information, including the right to request that a business provide a copy of their personal information collected, the right to have their personal information be deleted by covered entities, and the right to have inaccurate personal data corrected.

Like the CCPA, the Florida bills provide a private cause of action against a business if there is a data breach. Similarly, the private right of action is limited to only certain data breaches. A consumer could sue a business if their nonencrypted and nonredacted personal information was stolen in a data breach as a result of the business’s failure to maintain reasonable security procedures and practices to protect it. If this happens, the consumer can sue for the amount of monetary damages actually suffered from the breach or up to $750 per incident.

For all other violations, only the Florida Department of Legal Affairs can file an action. If the department has reason to believe that any business is in violation and that proceedings would be in the public interest, the department may bring an action against such business and may seek a civil penalty of not more than $2,500 for each unintentional violation or $7,500 for each intentional violation. Such fines may be tripled if the violation involves a consumer who is sixteen years of age or younger. A business may be found to be in violation if it fails to cure any alleged violation within 30 days after being notified in writing by the department of the alleged noncompliance.
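To put rough numbers on that exposure, the short sketch below works through the statutory ceilings under assumed facts; the violation counts, intent, and age determinations are hypothetical, and actual penalties are discretionary amounts “not more than” these caps.

```python
# Minimal sketch of the civil-penalty arithmetic described above.
# Violation counts and characterizations are hypothetical; actual penalties
# are discretionary and capped at "not more than" the amounts shown.

def max_penalty(violations: int, intentional: bool, involves_minor: bool) -> int:
    per_violation = 7_500 if intentional else 2_500
    multiplier = 3 if involves_minor else 1
    return violations * per_violation * multiplier

# Example: 10 intentional violations involving consumers aged 16 or younger.
print(max_penalty(10, intentional=True, involves_minor=True))  # 225000
```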

In their current form, if passed, both bills have an effective date of January 1, 2022. The legislation has been assigned to the Commerce Committee and the Civil Justice and Property Rights subcommittees. The bill has already received a favorable recommendation from the Regulatory Reform subcommittee. The companion Senate bill is also pending in committee. With the support of the governor and the speaker of the house, there is a strong possibility that some form of legislation will pass. Stay tuned for further updates and alerts from Bradley on state privacy law developments and obligations by subscribing to Bradley’s privacy blog, Online and OnPoint.

Privacy Litigation Updates for the Financial Services Sector: Claims Against Yodlee Survive and Limited Discovery of Envestnet Allowed

In November 2020, Yodlee and its parent company Envestnet filed separate motions to dismiss the class action lawsuit brought over Yodlee’s alleged data collection and use practices. Yodlee’s motion to dismiss argued that plaintiffs failed to state a claim under Federal Rule of Civil Procedure 12(b)(6), while Envestnet argued that its status as the parent company to Yodlee was not enough for the court to establish personal jurisdiction over Envestnet under Federal Rule of Civil Procedure 12(b)(2).

On February 16, 2021, Federal Magistrate Judge Sallie Kim partially granted and partially denied Yodlee’s motion to dismiss and reserved ruling on Envestnet’s motion to dismiss. The court allowed plaintiffs to cure deficiencies and file an amended complaint. On March 15, 2021, plaintiffs filed a Second Amended Complaint.

Yodlee’s Motion to Dismiss

Claims 1 and 10 – Invasion of Privacy:

The court held that plaintiffs have a reasonable expectation of privacy in their individual financial accounts. Yodlee is alleged to have improperly accessed and retained data from these personal accounts. Furthermore, Yodlee is alleged to have sold aggregated financial data from which it “would only take a few steps to identify the individual.”

The court denied Yodlee’s motion to dismiss Claims 1 and 10.

Claim 2 – Stored Communications Act:

The court held that plaintiffs failed to allege facts sufficient to satisfy the element of “electronic storage” because plaintiffs only alleged Yodlee “stores the information for its own misuse of the data.”

The court granted Yodlee’s motion to dismiss Claim 2 with leave to amend.

Claim 3 – Unjust Enrichment:

The court held that plaintiffs’ allegations that Yodlee acquired their data through a fraudulent scheme and sold that data were pled with enough particularity to put Yodlee on notice of the substance of the alleged fraudulent scheme.

The court denied Yodlee’s motion to dismiss Claim 3.

Claim 4 – California Civil Code § 1709:

The court found that plaintiffs sufficiently alleged a fraudulent scheme by Yodlee to deceive them.

The court denied Yodlee’s motion to dismiss Claim 4.

Claim 5 – California Unfair Competition Law – Business and Professional Code § 17200:

The court held that plaintiffs did not allege “a transaction or contract with Yodlee,” only the “Loss of Benefit of the Bargain,” and as such, it is unclear how plaintiffs “lost money or property as a result of Yodlee’s alleged conduct.” Furthermore, although plaintiffs allege the inability to seek indemnification and the heightened risk of identity theft, the court held that since neither of these has occurred yet, they are merely potential and hypothetical and not enough to confer standing to bring suit over this cause of action.

The court granted Yodlee’s motion to dismiss Claim 5 with leave to amend.

Claims 7 and 9 – Computer Fraud and Abuse Act and California Comprehensive Data Access and Fraud Act:

The court held that plaintiffs’ damage claims of “the costs of conducting damage assessments, restoring the data to its condition prior to the offense, and consequential damages they incurred by, inter alia, spending time conducting research to ensure that their identity had not been compromised and accounts reflect the proper balances” were conclusory and insufficient to show damage or loss.

The court granted Yodlee’s motion to dismiss Claims 7 and 9 with leave to amend.

Claim 8 – California Anti-Phishing Act of 2005:

The court held that plaintiffs’ allegations that Yodlee represented itself to be plaintiffs’ financial institutions, which was an allegedly fraudulent and deceitful impersonation of those institutions, and induced plaintiffs to provide their login credentials to defendants, were sufficient to state a claim under the California Anti-Phishing Act.

The court denied Yodlee’s motion to dismiss Claim 8.

Envestnet’s Motion to Dismiss for Lack of Personal Jurisdiction

The court held that plaintiffs have not alleged sufficient facts to bring an alter ego claim against Envestnet. The court noted that an alter ego claim is a rare remedy. For the doctrine to be invoked, the court held, there must be (1) unity of interest and (2) an inequitable result if the doctrine is not invoked. To show unity of interest, plaintiffs should plead facts supporting at least two or three of the following factors: “commingling of funds, identification of the equitable owners with domination and control of the two entities, instrumentality or conduit for a single venture or the business of an individual, failure to maintain minutes or adequate corporate records, use of the same office or business locations, identical equitable ownership of the two entities, use of a corporation as a mere shell, and the failure to adequately capitalize a corporation.” Furthermore, in some jurisdictions, such as the present jurisdiction, a showing of bad faith is required.

The court noted that, as it stands, plaintiffs have not alleged sufficient facts to support their alter ego claim. However, the court reserved ruling on Envestnet’s motion to dismiss until plaintiffs have an opportunity to conduct discovery on the issue. The court provided plaintiffs the opportunity to issue five document requests, five interrogatories, and five requests for admissions, as well as take one deposition of Envestnet. Plaintiffs must then file a supplemental brief no later than May 28, 2021, and Envestnet may file a response by June 11, 2021.

Takeaway

Many of plaintiffs’ claims have survived the motion to dismiss, bringing to light the legal and reputational risks from these data-sharing practices. Considering this pending case, businesses should review their privacy policies and procedures to ensure their data privacy compliance programs are up to date, accurately disclose their sharing practices, and protect consumer data. Based on this order, there are two significant areas to watch: anonymized, aggregated data and application programming interface (API) interactions.

Anonymized, Aggregated Data

The court found that plaintiffs have a reasonable expectation of privacy in their personal financial accounts at an individual level. Though Yodlee argued that plaintiffs do not have a reasonable expectation of privacy in anonymized, aggregated data, the court noted plaintiffs’ allegations that it “would only take a few steps to identify the individual Plaintiffs from the transactions.”

All businesses should review their contracts with third-party service providers, including those that provide APIs, to ensure that contractual language defining anonymized, aggregated data complies with relevant privacy laws and provides required protections, as well as defines whether and to what extent the business grants the third party permission to use and further disclose such anonymized, aggregated data.
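One practical way to pressure-test an “anonymized, aggregated” dataset before sharing it is to count how many individuals fall into each combination of quasi-identifiers, in the spirit of a k-anonymity check. The sketch below uses pandas on hypothetical transaction aggregates; any group of size one is effectively unique and may be re-identifiable in “a few steps.”

```python
# Minimal sketch of a k-anonymity style check on "aggregated" data.
# Data and column names are hypothetical; the point is that any group of
# size 1 on the quasi-identifiers is effectively unique and may be
# re-identifiable with only "a few steps."
import pandas as pd

aggregated = pd.DataFrame({
    "zip3":        ["282", "282", "282", "303"],
    "age_band":    ["30-39", "30-39", "40-49", "30-39"],
    "merchant":    ["grocery", "grocery", "grocery", "airline"],
    "total_spend": [412.10, 388.55, 129.99, 1250.00],
})

quasi_identifiers = ["zip3", "age_band", "merchant"]
group_sizes = aggregated.groupby(quasi_identifiers).size()

k = group_sizes.min()
unique_rows = group_sizes[group_sizes == 1]
print(f"k-anonymity of this aggregate: k={k}")
print(f"{len(unique_rows)} quasi-identifier combination(s) describe a single person")
```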

API Interactions

Many of plaintiffs’ claims were based on the lack of and/or unclear disclosure of Yodlee’s interactions with their financial institutions. While plaintiffs allege that Yodlee does not have authority or approval from each financial institution, the use of a login screen that appears to be the financial institution is likely part of the API software agreement that the financial institutions pay to use. Businesses should ensure that any interaction with third-party processors on their websites or applications clearly and explicitly states the role of the third party and that such role is properly reflected in the businesses’ privacy policies.

If you have any questions or would like to discuss your company’s data sharing practices, contact Courtney Achee, Lissette Payne or Kelley Hails. For more information on this developing case and other updates and alerts regarding privacy law developments, subscribe to Bradley’s privacy blog Online and On Point.

Critical Changes for U.S. Cleared Facilities

Codification of the NISPOM and replacement of JPAS

Two significant changes are underway at the Defense Counterintelligence and Security Agency (DCSA) – both of which require the immediate attention of businesses that hold a U.S. security clearance or are in the process of applying for a clearance.

The first change is the codification of the National Industrial Security Program Operating Manual (NISPOM). As background, the NISPOM has been the key guidance for protecting classified and certain other controlled information in accordance with the National Industrial Security Program (NISP) as currently overseen by the DCSA.

The Department of Defense (DoD) published a Final Rule codifying the NISPOM on December 21, 2020, which became effective February 24, 2021. The Final Rule requires that contractors implement its changes no later than six months after the rule’s effective date (August 2021). The NISPOM is now codified at 32 CFR Part 117. Further guidance on the Final Rule’s implementation will be published in an Industrial Security Letter (ISL).

The Final Rule establishes requirements, policies and procedures in accordance with the NISP – which outlines the protection of classified information that is disclosed to, or developed by, contractors, licensees, grantees, or certificate holders to prevent unauthorized disclosure.

Key changes include:

  • Requirements for cleared contractors to submit reports pursuant to Security Executive Agent Directive (SEAD) 3 and cognizant security agency (CSA) guidance.
  • An additional facility clearance tool for DCSA and government contracting activities (GCAs) as a limited entity eligibility specific to the requesting GCA’s classified information, and to a single, narrowly defined contract, agreement, or circumstance.
  • Elimination of the requirement for National Interest Determinations (NIDs) for certain covered contractors operating under a special security agreement with ownership by countries designated as part of the National Technology and Industrial Base (United Kingdom, Canada or Australia).
  • Determinations by a CSA with respect to requirements for top secret accountability.
  • Permitting Intrusion Detection System (IDS) installation and UL-2050 certification by an Occupational Safety and Health Administration (OSHA) Nationally Recognized Testing Laboratory (NRTL).
  • Directing cleared contractors to refer to 32 CFR Part 2001 for guidance on requirements for the protection of classified national security information (CNSI) to ensure consistency with national policy.
  • Clarification of responsibilities for Senior Management Officials (SMO).
  • Clarification that upon completion of a classified contract, the contractor must return all government provided or deliverable information to the custody of the government.

Significantly, for non-U.S. entities, the Final Rule also eliminates the requirement that entities under Foreign Ownership, Control, or Influence (FOCI) operating under a Special Security Agreement (SSA) obtain a NID. Under the Final Rule, SSA entities covered by the NTIB will be permitted to begin contract performance without first obtaining a NID.

The DCSA is currently reviewing existing ISLs to determine those that will be retained, re-issued, and/or rescinded. DCSA is also working on revisions to NISP-related forms, including the SF-328 – “Certificate Pertaining to Foreign Interests,” DD Form 441 – “Security Agreement,” and DD Form 441-1 – “Security Agreement Appendage.”

The second significant change by the DCSA is the retirement of the Joint Personnel Adjudications System (JPAS). JPAS is being replaced by the Defense Information Security System (DISS) as the next step toward deployment of the National Background Investigation Services (NBIS) and implementation of the Trusted Workforce 2.0 continuous vetting policy.

JPAS transitioned to a read-only mode on March 15, 2021, and will be fully retired on March 31, 2021. All updates to eligibility, access, and visit data must be completed in DISS prior to March 15.

Cleared contractors should work with their SMO, DCSA representative and counsel to assure their understanding and compliance with these significant changes. For other updates and alerts regarding national security law developments, subscribe to Bradley’s privacy blog Online and On Point.

A Second Chance for the Public Health Emergency Privacy Act

This is the seventh alert in a series of Bradley installments on privacy and cybersecurity developments arising from the COVID-19 pandemic. Click to read the first, second, third, fourth, fifth, and sixth installments.

Sen. Mark Warner (D-Va.) has re-introduced a bill to create the Public Health Emergency Privacy Act (PHEPA). First introduced in May 2020, the bill died in committee. This time, Warner is joined by 11 cosponsors in the Senate and by 32 sponsors of a related bill in the House of Representatives.

This newly introduced bill is identical to the earlier version, which we reported on at the time. PHEPA would have the usual notice-and-consent backbone, requiring affirmative consent from a consumer before a covered organization could collect, use, or disclose his or her emergency health data. Organizations collecting the data would need to protect it with reasonable security and not use the data for any purposes beyond those expressly identified in a privacy policy.

No preemption

Two controversial aspects of PHEPA bear repeating. First, PHEPA would expressly not preempt state laws. That would effectively make PHEPA a floor that states could raise either by existing legislation or with new legislation. For organizations doing business in multiple states, this could result in having to comply with higher standards than created by the federal bill, at least in some states.

Private right of action

Second, PHEPA would provide a private right of action to consumers. In addition to enforcement by the FTC and by states’ attorneys general, under PHEPA, affected consumers could sue directly for statutory damages of up to $5,000 per violation. Consumers could also recover attorneys’ fees and litigation costs.

Work in progress

When first introduced last year, the bill competed with a bill from Sen. Roger Wicker (R-Miss.) and others to create the “COVID-19 Consumer Data Protection Act of 2020.” The competing bill had fewer protections, express preemption, and no private right of action. Both bills died in committee.

The Wicker bill has not yet been reintroduced, and the Warner bill does not yet have bipartisan support. So, it remains to be seen when, how, and even if the federal government will create data privacy protections — either related to the COVID-19 pandemic or more generally. We will continue to update you as we learn more.