CMMC 2.0 – Simplification and Flexibility of DoD Cybersecurity Requirements

Continuing Effort to Protect National Security Data and Networks

Evolving and increasing threats to U.S. defense data and national security networks have necessitated changes and refinements to the U.S. regulatory requirements intended to protect them.

In 2016, the U.S. Department of Defense (DoD) issued a Defense Federal Acquisition Regulation Supplement (DFARS) clause intended to better protect defense data and networks. In 2017, DoD began issuing a series of memoranda to further enhance protection of defense data and networks via the Cybersecurity Maturity Model Certification (CMMC). In December 2019, the Department of State, Directorate of Defense Trade Controls (DDTC) issued long-awaited guidance governing, in part, the minimum encryption requirements for storage, transport, and/or transmission of controlled unclassified information (CUI) and technical defense information (TDI) otherwise restricted by the International Traffic in Arms Regulations (ITAR).

DFARS initiated the government’s efforts to protect national security data and networks by imposing specific NIST cybersecurity requirements on all DoD contractors with access to CUI, TDI, or a DoD network. Compliance with DFARS, however, was self-assessed.

CMMC provided a broad framework to enhance cybersecurity protection for the Defense Industrial Base (DIB). CMMC proposed a verification program to ensure that NIST-compliant cybersecurity protections were in place to protect CUI and TDI residing on DoD and DoD contractors’ networks. Unlike DFARS, CMMC initially required certification of compliance by an independent cybersecurity expert.

The DoD has announced an updated cybersecurity framework, referred to as CMMC 2.0. The announcement comes after a months-long internal review of the proposed CMMC framework. It still could take nine to 24 months for the final rule to take shape. But for now, CMMC 2.0 promises to be simpler to understand and easier to comply with.

Three Goals of CMMC 2.0

Broadly, CMMC 2.0 is similar to the earlier-proposed framework. Familiar elements include a tiered model, required assessments, and contractual implementation. But the new framework is intended to facilitate three goals identified by DoD’s internal review.

  • Simplify the CMMC standard and provide additional clarity on cybersecurity regulations, policy, and contracting requirements.
  • Focus on the most advanced cybersecurity standards and third-party assessment requirements for companies supporting the highest priority programs.
  • Increase DoD oversight of professional and ethical standards in the assessment ecosystem.

Key Changes under CMMC 2.0

The most impactful changes under CMMC 2.0 are:

  • A reduction from five to three security levels.
  • Reduced requirements for third-party certifications.
  • Allowances for plans of action and milestones (POA&Ms).

CMMC 2.0 has only three levels of cybersecurity

An innovative feature of CMMC 1.0 had been the five-tiered model that tailored a contractor’s cybersecurity requirements according to the type and sensitivity of the information it would handle. CMMC 2.0 keeps this model, but eliminates the two “transitional” levels in order to reduce the total number of security levels to three. This change also makes it easier to predict which level will apply to a given contractor. At this time, it appears that:

  • Level 1 (Foundational) will apply to federal contract information (FCI) and will be similar to the old first level;
  • Level 2 (Advanced) will apply to controlled unclassified information (CUI) and will mirror NIST SP 800-171 (similar to, but simpler than, the old third level); and
  • Level 3 (Expert) will apply to more sensitive CUI and will be partly based on NIST SP 800-172 (possibly similar to the old fifth level).

Significantly, CMMC 2.0 focuses on cybersecurity practices, eliminating the few so-called “maturity processes” that had baffled many DoD contractors.

CMMC 2.0 relieves many certification requirements

Another feature of CMMC 1.0 had been the requirement that all DoD contractors undergo third-party assessment and certification. CMMC 2.0 is much less ambitious and allows Level 1 contractors — and even a subset of Level 2 contractors — to conduct only an annual self-assessment. It is worth noting that a subset of Level 2 contractors — those having “critical national security information” — will still be required to seek triennial third-party certification.

CMMC 2.0 reinstitutes POA&Ms

An initial objective of CMMC 1.0 had been that — by October 2025 — contractual requirements would be fully implemented by DoD contractors. There was no option for partial compliance. CMMC 2.0 reinstitutes a regime that will be familiar to many, by allowing for submission of Plans of Action and Milestones (POA&Ms). The DoD still intends to specify a baseline number of non-negotiable requirements. But a remaining subset will be addressable by a POA&M with clearly defined timelines. The announced framework even contemplates waivers “to exclude CMMC requirements from acquisitions for select mission-critical requirements.”

Operational takeaways for the defense industrial base

For many DoD contractors, CMMC 2.0 will not significantly impact their required cybersecurity practices — for FCI, focus on basic cyber hygiene; and for CUI, focus on NIST SP 800-171. But the new CMMC 2.0 framework dramatically reduces the number of DoD contractors that will need third-party assessments. It could also allow contractors to delay full compliance through the use of POA&Ms beyond 2025.

Increased Risk of Enforcement

Regardless of the proposed simplicity and flexibility of CMMC 2.0, DoD contractors need to remain vigilant to meet their respective CMMC 2.0 level cybersecurity obligations.

Immediately preceding the CMMC 2.0 announcement, the U.S. Department of Justice (DOJ) announced a new Civil Cyber-Fraud Initiative on October 6 to combat emerging cyber threats to the security of sensitive information and critical systems. In its announcement, the DOJ advised that it would pursue government contractors who fail to follow required cybersecurity standards.

As Bradley has previously reported in more detail, the DOJ plans to utilize the False Claims Act to pursue cybersecurity-related fraud by government contractors, or involving government programs, where entities or individuals put U.S. information or systems at risk by knowingly:

  • Providing deficient cybersecurity products or services
  • Misrepresenting their cybersecurity practices or protocols, or
  • Violating obligations to monitor and report cybersecurity incidents and breaches.

The DOJ also expressed its intent to work closely on the initiative with other federal agencies, subject matter experts, and its law enforcement partners throughout the government.

As a result, while CMMC 2.0 will provide some simplicity and flexibility in implementation and operations, U.S. government contractors need to be mindful of their cybersecurity obligations to avoid heightened enforcement risks.

Contact Andrew Tuggle or David Vance Lucas with any questions about the impact of CMMC 2.0 on your business.

FTC Finalizes Updated Safeguards Rule Under GLBA to Dramatically Expand Data Security Requirements and Scope of Rule

Until now, companies primarily regulated by the Federal Trade Commission (FTC) were given only vague directives to implement systems sufficient to safeguard customer data, coupled with FTC “recommendations” as to best practices. That is about to change with the FTC’s finalization of its proposed amendments to the Standards for Safeguarding Customer Information (Safeguards Rule) on October 27. The new requirements will become effective one year after the rule is published in the Federal Register, so companies should start planning for compliance now to avoid fire drills down the road.

The new Safeguards Rule is more aligned with the requirements imposed by the Federal Financial Institutions Examination Council (FFIEC) for banking and depository institutions and, in some respects, imposes more burdensome requirements. Companies subject to the FTC’s authority should start prepping now to ensure that their current data security practices and infrastructure — and those of their service providers — will survive FTC scrutiny.

Who Is Covered by the Amended Safeguards Rule?

The FTC’s jurisdiction applies to a surprisingly broad range of businesses. This updated rule applies to entities traditionally within the FTC’s jurisdiction for rulemaking and enforcement, which include non-banking (non-depository) institutions such as mortgage brokers, mortgage servicers, payday lenders, and other similar entities.

But the FTC’s jurisdiction does not end there, and in fact, the rule’s definition now encompasses companies that never traditionally would be considered “financial institutions.” For example, the scope of the new rule now broadly applies to businesses that bring together buyers and sellers of a product or service, potentially drawing in companies of all shapes and sizes, such as marketing companies. Furthermore, the FTC has previously determined that higher education institutions also fall within the definition of “financial institutions,” and thus are subject to the rule’s requirements, because higher education institutions participate in financial activities, such as making federal student loans.

Businesses operating in any of these industries should take note of these important changes and determine whether sweeping changes to existing information security programs are necessary.

What Are the Biggest Takeaways?

There are a number of changes contained in the final Safeguards Rule, but the biggest takeaways are:

  1. The definition of “financial institution” has been dramatically expanded to include “finders,” or entities that bring together buyers and sellers of any product or service for transactions;
  2. More specific requirements for information security programs were imposed, including encryption for data both in transit and at rest;
  3. A new small business exception to the Safeguards Rule’s requirements was added;
  4. Mandatory requirements for risk assessments (which now must be set forth in writing) were established; and
  5. Required periodic reporting on information security programs to boards of directors or governing bodies was implemented.

We will explore some of these noteworthy changes in greater detail.

Expanded Definition of “Financial Institution”

The definition of “financial institution” has now been expanded to include a “finder,” which is defined as a company that “bring[s] together one or more buyers and sellers of any product or service for transactions that the parties themselves negotiate and consummate.” To support this change, the FTC reasoned that finders often possess sensitive consumer financial information and should be subject to the requirements of the Safeguards Rule in protecting it from unauthorized disclosure.

Commenters on the proposed rule expressed grave concerns that the term “finder” is overly broad and would inappropriately sweep large swaths of companies into the definition of a “financial institution.” The Association of National Advertisers also expressed concern that advertisers could constitute “finders” under this definition due to their role in connecting buyers and sellers. However, the FTC noted that although the definition is broad, its scope is significantly limited because the Safeguards Rule applies only to the information of customers and “it will not apply to finders that have only isolated interactions with consumers and that do not receive information from other financial institutions about those institutions’ customers.” It remains to be seen how broadly the term “finder” will be construed, and this could prove to be one of the biggest question marks of the scope of the new rule.

Specific Data Protection Requirements

Whereas the FTC previously left specific aspects of satisfactory information security systems up to the discretion of the business, the FTC now requires that financial institutions address the following:

  • Access controls;
  • Data inventory and classification;
  • Encryption;
  • Secure development practices;
  • Authentication;
  • Information disposal procedures;
  • Change management;
  • Testing; and
  • Incident response.

The biggest takeaway here is that the FTC is imposing more specific requirements, such as encryption, for the protection of sensitive customer information, whereas the previous Safeguards Rule allowed financial institutions to exercise discretion by referring to data protection in generalities. In addition, the rule’s encryption requirements, which include encrypting data both in transit and at rest, are more burdensome than the FFIEC’s proposed guidelines, which do not require banks to encrypt data at rest unless the institution’s risk assessment determines that such encryption is necessary.

Mandatory Periodic Reporting

The new rule requires that a financial institution’s chief information security officer report in writing, at least annually, to the financial institution’s board of directors or governing body regarding the following:

  • The overall status of the information security program and financial institution’s compliance with the Safeguards Rule; and
  • Material matters related to the information security program, addressing issues such as risk assessment, risk management and control decisions, service provider arrangements, results of testing, security events or violations and management’s responses thereto, and recommendations for changes in the information security program.

If the company does not have a board of directors or equivalent governing body, the chief information security officer must “make the report to a senior officer responsible for the financial institution’s information security program.”

Small Business Exemption

The new rule adds a “small business” exemption, which excludes businesses that “maintain customer information concerning” fewer than 5,000 consumers from the following requirements of the new rule:

  • Requiring a written risk assessment (314.4(b)(1));
  • Requiring continuous monitoring or annual penetration testing and semiannual vulnerability assessments (314.4(d)(2));
  • Requiring a written incident response plan (314.4(h)); and
  • Requiring an annual written report by the chief information security officer (314.4(i)).

It remains to be seen, however, how many businesses will be able to take advantage of this exemption as a practical matter, especially for businesses that are required to maintain consumer records for a certain number of years.

Other Noteworthy Changes

In addition to the important changes outlined above, there are also several other important changes to note. Although not intended to be exhaustive, the list of other changes includes:

  • The addition of a definition for “authorized user,” which means “any employee, contractor, agent, customer, or other person that is authorized to access any of your information systems or data.” This term was added in conjunction with specific data access restriction requirements and more specific requirements for monitoring anomalous patterns of usage by “authorized users.”
  • The definition of “security event” now includes the compromise of customer information in physical form, as opposed to only electronic form.
  • The new rule imposes mandatory requirements for risk assessments (which now must be set forth in writing). Risk assessments were already required, but requirements are now more explicit.


In light of these updates, financial institutions should review their policies and procedures, as well as their contracts with service providers, to ensure that all information security systems comply with the new, detailed security requirements of the amended Safeguards Rule. As always, an ounce of prevention on the front end will greatly reduce the risk of an FTC enforcement action or consumer litigation down the line.

For more information on this issue and other updates and alerts regarding privacy law developments, subscribe to Bradley’s privacy blog, Online and On Point.

A Fintech Leader’s Thoughts on the North Carolina Regulatory Sandbox Act

As part of Bradley’s continuing coverage of the North Carolina Sandbox Act, we wanted to know what community members and NC fintech aficionados thought about this proposed legislation. We posed six questions to Tariq Bokhari, an influential leader in the financial technology (fintech) industry, who serves as the executive director of the Carolina Fintech Hub (CFH). Read more of our conversation below on how this regulatory sandbox will impact North Carolina’s fintech industry.

Bradley: How will the NC Regulatory Sandbox Act affect fintech companies generally?

Bokhari: The premise behind the NC Regulatory Sandbox Act (Innovation Sandbox) is that innovators and startups in tech 1) have difficulty piloting new ideas in a fail-fast manner due to a regulatory system not designed for that, and 2) are viewed and set up as disruptive forces to incumbent stakeholders, rather than opportunities to partner with those incumbents in a win-win scenario. The Innovation Sandbox is designed to create tools that decrease both of those headwinds that are pervasive across the country, and in doing so create a competitive advantage for our region and all that reside within it. The Carolina Fintech Hub has championed this effort for several years now, and found like-minded partners like the NC Blockchain Initiative, because we strongly believe being the most entrepreneurial and nimble of the 50 states will position us as global leaders in technology and innovation.

Bradley: What products and services are applicable for this program?

Bokhari: Its scope can truly be anything that touches technology, although the initial focus will be on fintech, insurtech and blockchain. I envision this program being expanded after one or two years to include other areas, like possibly securities, thus making the program more comprehensive.

Bradley: With regulatory sandboxes already being set up for other states’ finance and insurance economies, do you see a possible playbook for North Carolina’s fintech industry?

Bokhari: There are a few unique differences with NC’s Innovation Sandbox, including our unique focus on promoting the partnership between our incumbents and startups rather than disruptive friction between them.

Bradley: Can you elaborate on what makes NC’s Innovation Sandbox unique?

Bokhari: With its unique sandbox approach, North Carolina decided to start simple while allowing for natural evolution, namely by embedding formalized accountability around innovation across NC via an Innovation Commission.

This Innovation Commission is designed to be centralized (not embedded in any one state agency), cross-representative (to maximize collaboration), lightweight in its design (very simple in its mandate) and serve as a clearing house of innovation requests and ideas (sourced from the industry with the help of established non-governmental organizations (NGOs)).

The Innovation Commission is really envisioned to have only two major tasks: 1) review the requests of those who apply to participate in any of the sandbox’s available tools, and if it deems the requests to have merit, route them to the appropriate regulating agency or agencies for ultimate decisioning; and 2) review requests to create new tools that enable further innovation, and if it deems the requests to have merit, route them to the appropriate regulating agency or agencies for ultimate decisioning.

Bradley: How will this proposed legislation change economic development in North Carolina?

Bokhari: In its simplest form, this legislation will create an Innovation Commission that will give North Carolina a significant advantage over every other state in recruiting and retaining tech companies. These companies will be able to perform certain activities with reduced governmental red tape here.

Bradley: How does Carolina Fintech Hub plan to help their fintech partners strike that balance between protecting consumers while promoting emerging fintech technologies and innovations in the field?

Bokhari: Defining the tools is the most challenging task for any state to address, so the “secret sauce” in our approach is not trying to assume what set of tools is needed up front. Instead of assuming, we use a platform that can react to the market demands for tools as they are recognized in this formal Innovation Commission structure, while still operating within the confines of the existing regulatory agency construct to avoid unneeded or complicated friction.

The proposed legislation envisions a tool that has also been incorporated in other states’ sandbox efforts to date. This tool would enable small-scale piloting of innovations without having to apply for what may otherwise be cumbersome licenses or having to build out large-scale compliance programs for certain regulatory frameworks.

In addition to the tool described above, there are two additional tools that are in the hopper for near term exploration when the Innovation Commission is established: 1) after successful completion of the centralized sandbox program, a startup receives a limited scope “stamp of recognition” that can provide additional confidence to incumbent banks and institutions and their vendor risk management processes when they contemplate engaging the startup; and 2) a blockchain Innovation Sandbox use case. This still requires significant design and vetting. And the goals here are aspirational: I envision, not only startups that are freestanding entities taking advantage of the Innovation Sandbox, but also startups acquired or established as affiliates or subsidiaries by big financial institutions utilizing the sandbox as well, as the model matures.

Bradley: Do you anticipate the Regulatory Sandbox Act will slow the pace of companies integrating emerging technology, including blockchain technology, in North Carolina’s fintech space? And will the act attract companies to relocate to NC?

Bokhari: I am highly confident this legislation will multiply the pace of innovative, emerging technology across NC, as well as our ability to recruit nationally and internationally, for a simple reason: Companies will be able to operate with less friction, and capitalize on more partnerships with our incumbents in NC, more so than in any other state. I am most interested in the blockchain aspects of the regulation. I am seeing a spike in smart contract and crypto activity lately, but most places across the country don’t even know this activity is happening, let alone have a sophisticated system to champion it statewide.

Bradley is closely monitoring this legislation and will provide continuing coverage of the proposed bill in the coming weeks and months. If passed, the legislation could become effective October 1, 2021.

Another Data Privacy Law? Colorado Enacts the Colorado Privacy Act

Colorado became the third state to enact comprehensive data privacy legislation when Gov. Jared Polis signed the Colorado Privacy Act (CPA) on July 8, 2021. The CPA shares similarities with its stateside predecessors, the California Consumer Privacy Act (CCPA), the California Privacy Rights Act (CPRA), and the Virginia Consumer Data Protection Act (VCDPA), as well as the European Union’s General Data Protection Regulation (GDPR). But the CPA’s nuances must be considered as companies subject to these statutes craft holistic compliance programs.

The CPA goes into effect on July 1, 2023. But, given its complexity, the time for companies to start preparing is now. Here are some answers to questions about the scope of the new law, the consumer rights it provides, the obligations it imposes on businesses, and its enforcement methods.

Does the CPA apply to my business?

The CPA’s jurisdictional scope is most like the VCDPA’s. The CPA applies to any “controller” – defined as an entity that “determines the purposes for and means of processing personal data” – that “conducts business in Colorado” or produces or delivers products or services “intentionally targeted” to Colorado residents and either (1) controls or processes the personal data of 100,000+ “consumers” each calendar year; or (2) controls or processes the personal data of 25,000+ consumers and derives revenue or receives discounts from selling personal data.

“Personal data” is defined as “information that is linked or reasonably linkable to an identified or identifiable individual” other than “publicly available information” or “de-identified data.” The CPA defines a “consumer” as a “Colorado resident acting only in an individual or household context.”

The CPA provides entity-level exemptions to air carriers and national securities associations, among others. Unlike the CCPA, CPRA, and VCDPA, the CPA does not provide an entity-level exemption to non-profit organizations.

How do I handle consumer requests regarding their personal data?

The CPA provides consumers with the right to submit authenticated requests to a controller to (1) opt-out of certain processing of their personal data; (2) access their personal data and confirm if it is being processed; (3) correct inaccuracies in their personal data; (4) delete their personal data; and (5) obtain their personal data in a portable format. A controller must inform the consumer of any actions taken or not taken in response within certain timelines.

Like the VCDPA and unlike the CCPA and CPRA, the CPA provides consumers with the right to appeal a controller’s decision concerning an authenticated request. Controllers must set up internal processes for handling such appeals.

What are a consumer’s opt-out rights?

The CPA provides consumers with the right to opt-out of the processing of personal data for: (1) sale; (2) targeted advertising; or (3) profiling. The final two opt-out rights are also found in the VCDPA and CPRA, but not the CCPA.

Like the CCPA’s definition, the CPA’s definition of the “sale” of personal data is broad: “the exchange of personal data for monetary or other valuable consideration by a controller to a third party.” But the CPA’s exceptions to this definition are much broader. Under the CPA, a controller does not sell personal data by disclosing personal data (1) to an affiliate; (2) to a “processor” that processes the personal data on the controller’s behalf; (3) to a third party for “purposes of providing a product or service” that the consumer requests; (4) that a consumer “directs the controller to disclose or intentionally discloses by using the controller to interact” with a third party; or (5) that the consumer “intentionally made available … to the general public via a channel of mass media.”

When do I have to obtain opt-in consent from a consumer?

The CPA requires that a controller obtain opt-in consent before processing (1) “sensitive” data; (2) the personal data of a “known” child; or (3) personal data “for purposes that are not reasonably necessary to or compatible with” the processing purposes that the controller previously specified to the consumer. To provide the requisite “consent,” a consumer must make a “clear, affirmative act” that signifies their “freely given, specific, informed, and unambiguous agreement” to the processing.

Importantly, a consumer accepting broad terms of use or a “similar document that contains descriptions of personal data processing along with other, unrelated information” does not constitute consent. Nor does an agreement obtained through “dark patterns,” which the CPA defines as user interfaces “designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice.”

What does my privacy notice have to say?

A controller must provide consumers with a privacy notice that is “reasonably accessible, clear, and meaningful.” All privacy notices must include the following information: (1) the categories of personal data collected or processed; (2) the purposes for which personal data are processed; (3) the categories of personal data shared with third parties; (4) the categories of third parties with whom the controller shares personal data; and (5) how a consumer can submit authenticated requests and appeals regarding such requests.

If a controller sells personal data or processes such data for targeted advertising, the privacy notice must “clearly and conspicuously disclose” that fact and how consumers can opt-out. The “opt-out method” also must be provided in a separate location that is “clear, conspicuous, and readily accessible.”

Do I have to perform data protection assessments?

Similar to the VCDPA and GDPR, the CPA requires that controllers conduct and document a “data protection assessment” regarding each of their processing activities that: (1) involves personal data acquired on or after July 1, 2023; and (2) presents a “heightened risk of harm” to a consumer. Processing that presents such a heightened risk includes (1) selling personal data; (2) processing sensitive data; (3) processing personal data for targeted advertising; or (4) processing personal data for profiling that presents a “reasonably foreseeable risk” of certain consumer harms.

Among other requirements, a data protection assessment must “identify and weigh” the benefits of the processing to the “controller, the consumer, other stakeholders, and the public” against “the potential risks to the rights of the consumer,” as “mitigated by safeguards that the controller can employ to reduce the risks.”

What are my data minimization and security requirements?

A controller’s “collection of personal data must be adequate, relevant, and limited to what is reasonably necessary” to the processing’s purposes that have been disclosed to the consumer. As noted above, a controller cannot process for another purpose without the consumer’s consent.

A controller must take “reasonable measures” to secure personal data from “unauthorized acquisition” during storage and use. These data security practices must be appropriate for the “nature” of the controller’s business and the “volume, scope, and nature of the personal data processed.”

How is the CPA enforced?

Unlike the CCPA, the CPA does not provide a private right of action. It is enforceable only by Colorado’s attorney general and district attorneys. CPA violations constitute a deceptive trade practice and are thus subject to civil penalties of up to $20,000 per violation.

Until January 1, 2025, the attorney general or district attorney must provide notice of a violation, which triggers a 60-day cure period. If the controller fails to cure the violation within this period, the attorney general or district attorney may initiate an enforcement action.


While the CPA’s similarities to predecessor privacy statutes will allow companies to leverage their current compliance efforts to obtain CPA compliance, the statute’s enactment nonetheless adds another layer to already onerous data privacy obligations.

Bradley’s Cybersecurity and Privacy team is here to help. Stay tuned for further updates and alerts on privacy-law developments by subscribing to the Online and On Point blog.

Technology Boom in NC? What You Should Know About the Proposed Regulatory Sandbox in the Tar Heel State

Technology is evolving and advancing at a dizzying pace across the globe. Emerging technologies are reimagining everything from how we interact with each other to how we interact with businesses and institutions. Given the upward trajectory of technology, it seems that the “innovation” business is ripe for opportunity — an opportunity that appears poised to take off in North Carolina.

In 2021 alone, North Carolina has been the target for some very high-profile technology announcements, including Google’s plans to open a cloud engineering hub in Durham and Apple’s new campus in Research Triangle Park. These exciting developments are now coupled with recent proposed legislation that would create a “regulatory sandbox” further incentivizing technological economic development to expand North Carolina citizens’ access to products and services or unique business models not currently widely available.

A regulatory sandbox allows companies and entrepreneurs to test emerging technologies, products, services, or business models at the leading edge of (or even outside of) an established regulatory framework. Sandboxes have popped up across the country — from Arizona in 2018 to Kentucky, Nevada, Utah, Vermont, and Wyoming in 2019 to Florida and West Virginia in 2020 — as a way of spurring economic growth and breaking down the barriers to market access often faced by creative business models and startups. North Carolina is one of the most recent states to investigate the potential economic benefits of becoming an innovation hub. Although North Carolina tried and failed to implement a sandbox in 2019, the 2021 iteration seems more likely to succeed given the growing number of peer states that have since adopted, or are currently working on, comparable sandbox-creating legislation.

North Carolina’s Regulatory Sandbox Act of 2021 (the “NC Sandbox Act”) seeks to establish a more flexible regulatory environment for the financial services and insurance industries within the state. Here is what you should know about the proposed NC Sandbox Act, which is currently pending before the Committee on Commerce:

Purpose and Applicability

The NC Sandbox Act would permit an applicant to temporarily test an innovative financial product or service, making such product or service available to consumers on a limited basis without subjecting the applicant company to certain licensing or other regulatory obligations otherwise imposed under applicable state law.

The NC Sandbox Act would apply to entities regulated by the Office of Commissioner of Banks or the Department of Insurance and offering a product or service that falls within the definition of an “innovative product or service” — i.e., one that uses a new or emerging technology, or that provides products, services, business models, or delivery mechanisms not currently widely accessible to the public.

Establishment of North Carolina Innovation Council

To govern the program, the NC Sandbox Act proposes to create an “Innovation Council,” which would be tasked with supporting innovation, investment, and job creation within North Carolina by encouraging participation in the regulatory sandbox. The 11-person council would set standards, principles, guidelines, and policy priorities for the types of innovations that the regulatory sandbox program would support. Interestingly, early analysis of the bill expressly mentions authorizing the Innovation Council to focus on blockchain initiatives (here’s a legislative analysis of SB470). The Innovation Council would also be responsible for approving admission into the regulatory sandbox program.

Innovation Waiver Applications

For $50, an innovator can apply for admission into the regulatory sandbox program. In determining whether to admit an applicant, the Innovation Council will consider:

  1. The nature of the innovative product or service proposed to be made available to consumers, including the potential risk to consumers;
  2. The methods that will be used to protect consumers and resolve complaints during the sandbox period;
  3. The entity’s business plan, including availability of capital;
  4. Whether the entity’s management has the necessary expertise to conduct a pilot of the innovative product or service during the sandbox period;
  5. Whether any person substantially involved in the development, operation, or management of the innovative product or service has been convicted of or is currently under investigation for fraud or state or federal securities violations; and
  6. Any other factor that the Innovation Council or the applicable state agency determines to be relevant.

By tasking the Innovation Council with the responsibility of considering consumer protection in addition to economic growth when evaluating applicant entities, proponents of the legislation seemingly attempt to avoid some of the criticisms that surrounded the 2019 sandbox proposal.

Applicants must also have a physical presence in North Carolina. A waiver of specified requirements imposed by statute or rule may be granted as part of entry into the program and would be valid for the duration of participation in the regulatory sandbox, but typically not to exceed 24 months.

More to Come

The proposed legislation also addresses sandbox program requirements, consumer protections, record requirements, privacy, and other initiatives and obligations in more detail. As mentioned above, the legislation is currently pending before the committee, and Bradley is closely monitoring this legislation and will provide continuing coverage of the proposed bill in the coming weeks and months. If passed, the legislation could become effective October 1, 2021.

If you have any questions, please reach out to the authors, Erin Illman or Lyndsay Medlin.

Energy and Infrastructure Companies Need to Know about the DOE’s and Other Agencies’ Focus on Cybersecurity

On March 18, 2021, the Department of Energy’s (DOE) Office of Cybersecurity, Energy Security, and Emergency Response (CESER) announced three new research programs that are “designed to safeguard and protect the U.S. energy system” from potential cyberattacks. The DOE also announced a 100-day plan to address cybersecurity risks to the U.S. electric system. Not to be left behind, the Transportation Security Administration (TSA) issued a new security directive in light of the Colonial Pipeline cyberattack. Together, these agency actions demonstrate the scale and intensity of the threat to the energy industry and the focus of the government to curb the threat to our national infrastructure systems. Energy companies should monitor these developments and assess their internal controls to ensure they are cyber-resilient.

The Colonial Pipeline cyberattack surfaced on May 7, 2021, and confronted residents of many Southern states with a real possibility of running out of gas. But, in the days leading up to the ransomware attack, the DOE and the Biden administration were already turning their attention to cyberthreats to the energy industry. The electric system was of special concern, being another piece of critical infrastructure vulnerable to attacks — extensive power interruptions could have devastating consequences. The Colonial Pipeline cyberattack vividly demonstrates that the post-9/11 sensitivity to terrorists’ physical threats must now include cyber threats.

Less than a week after the pipeline restarted, the DOE revealed its three-prong research plan. The research programs will focus on: (1) securing against vulnerabilities in globally sourced technologies; (2) developing solutions to electromagnetic and geomagnetic interference; and (3) cultivating both research on cybersecurity solutions and the new talent needed to deploy them. The emphasis on the supply chain echoes anxieties in the Executive Order on Improving the Nation’s Cybersecurity, with its goals for the security of commercial software.

Importantly, the DOE is attempting to work with the industry. It kicked off its implementation of a 100-day plan — a plan formed by the Biden administration “to enhance the cybersecurity of electric utilities’ industrial control systems (ICS) and secure the energy sector supply chain” — by soliciting input from stakeholders. Through a Request for Information (RFI), the Office of Electricity sought comments from the public on various aspects of the electric infrastructure. When the public-comment period closed on June 7, 2021, nearly 100 entities had submitted comments. The energy industry is fully as interested in these issues as is the government.

Directly responding to the Colonial Pipeline cyberattack, the Department of Homeland Security (DHS) — through the TSA — issued Security Directive Pipeline-2021-01, aimed at tightening its control of pipelines’ security. The directive requires that critical pipeline operators (1) report cyber incidents; (2) designate a Cybersecurity Coordinator; and (3) assess, remediate, and report their cybersecurity measures. Failures to correct deficiencies or to comply with the new rules could result in substantial fines under the TSA’s enabling statute.

Federal agencies and the Biden administration are giving strong, coordinated signals that — as a result of cyber threats and attacks — lax standards, minimal enforcement, and carrots for compliance are things of the past. However, the large number of agencies and divisions with enforcement powers could make compliance confusing and difficult — especially if different critical infrastructure industries are subject to different standards. As a result, infrastructure and energy companies should take action now to harden their security measures. Best practices will help mitigate not only government scrutiny, but also the threat of an attack.

Executive Order on Cybersecurity Sets Aggressive Timeline

The Colonial Pipeline cyberattack prompted the issuance of a long-awaited executive order (EO) on improving U.S. cybersecurity. The EO mandates that, within six months, all federal agencies implement multi-factor authentication (MFA) and both at-rest and in-transit encryption. It also calls for agencies to comprehensively log, share, and analyze information about cyber incidents and creates a Cyber Safety Review Board to that end. The EO sets deadlines for agencies to write guidelines for securing software and detecting threats.

Bradley has authored prior articles and alerts regarding the U.S. government’s increasing attention to cybersecurity — including at the Department of Defense, across the federal government as a whole, and even at the state level. With its focus on timelines and deadlines, this EO emphasizes the urgency of improving cybersecurity across industries.

Three goals, with a focus on timing

In a press call, the White House highlighted three goals of the EO:

  • Protect federal networks with specific tools, such as encryption, MFA, endpoint detection and response (EDR), logging, and Zero Trust Architecture.
  • Improve the security of commercial software by establishing security requirements; by using the power of the purse to prime the market for secure software; and by labeling consumer products with a cybersecurity grade.
  • Pool agencies’ information about incidents and enhance incident responses, including through a Cyber Safety Review Board (modeled on the national board that investigates plane crashes).

Reflecting the urgency of better cybersecurity, the EO sets clear, tight deadlines — more than 40 of them. The earliest deadline is set only 14 days after the EO’s release. More than 15 agencies — including the Office of Management and Budget, the Attorney General, the DoD, CISA, and NIST — are tasked with specific responsibilities to write, implement, or enforce the new measures.

Outline of the executive order

The Biden administration’s stated policy is that cybersecurity is a “top priority and essential to national and economic security.” To that end, the provisions of the EO apply to “all Federal Information Systems.”

The EO specifically addresses the following issues:

  • Removing barriers to sharing threat information. The White House’s fact sheet uses the phrase “sharing between government and the private sector.” This section aims to expand the requirements on the private sector to provide incident information to the government. To that end, the EO calls for revision of both the FAR and DFARS reporting requirements. Defense contractors are already familiar with the DFARS requirement to “rapidly report” cyber incidents within 72 hours. New requirements may require less rapid reporting for less sensitive incidents.


  • Modernizing federal government cybersecurity. This section mandates specific security requirements. Before November 8, 2021, all federal agencies must implement MFA and encryption. Additionally, the EO sets a timeline for adoption of more secure cloud services and for government-wide Zero Trust Architecture. Importantly, this section reiterates the administration’s policy that modernization must proceed while “protecting privacy and civil liberties.”


  • Enhancing software supply chain security. As chartered, NIST will shoulder the burden for establishing baseline security standards for software, including defining “critical software” and secure procedures for software development. One important component will be providing a Software Bill of Materials (SBOM), which is a record of the details and supply-chain relationships of components used to build software. The SBOM is similar to a list of ingredients on food packaging. It will allow tracking of open-source and other third-party components through a supply chain so that risks can be more easily evaluated — and patched. A second important component is a “consumer labeling program” similar to Singapore’s, for grading the cybersecurity of IoT devices.


  • Establishing a Cyber Safety Review Board. When a plane crashes, the National Transportation Safety Board investigates and makes recommendations to improve the safety of air transportation. There is no similar body for reviewing cyber incidents. The EO mandates that the Department of Homeland Security (DHS) establish just such a board, with both government and private-sector representatives having seats at the table. A senior administration official explained that the board’s first task will be to review the SolarWinds incident.


  • Standardizing the federal government’s playbook. The EO calls for creation of a “playbook” for agencies to use in responding to cybersecurity vulnerabilities and incidents. Recognizing that some such guidance has been in place for many years, the EO expressly requires that the guidance “incorporate all appropriate NIST standards.”


  • Improving detection of vulnerabilities and incidents. Agencies are called to actively hunt for threats and vulnerabilities. Each agency must submit its plan to CISA for a Continuous Diagnostics and Mitigation Program. This program has been around since 2012. The EO seeks to enhance threat-hunting activities and deployment of other Endpoint Detection and Response (EDR) initiatives.


  • Improving investigative and remediation capabilities. The very earliest deadline set by the EO is May 26 for DHS to recommend requirements for logging events and retaining other relevant incident data. The EO invites the FAR Council to consider the recommendations in its revision of the FAR and the DFARS reporting requirements.

What this means for industry

Much of the EO mandates actions by government agencies. But it does create action items for private entities. Above all, government contractors should watch for impactful changes to FAR and DFARS cybersecurity clauses. These have been revised multiple times recently, and we expect the Biden administration to revise them again — especially amid ongoing delays of the CMMC rollout. Software developers should begin inventorying their products and preparing SBOMs, especially for those in use by government agencies. Manufacturers of IoT devices should also expect that their devices must soon bear a label that marks their security level. Market forces may encourage production of higher-security devices.

Contact Andrew Tuggle, David Vance Lucas, or Sarah Sutton Osborne with any questions about the order’s impact on your business.

Circuit Split No More: 2nd Circuit Clarifies Article III Standing in Data Breach Cases

While more states push forward on new privacy legislation statutorily granting consumers the right to litigate control of their personal information, federal courts continue to ponder how data breach injury fits traditional standing requirements. Prior to McMorris v. Carlos Lopez & Assocs., LLC, many argued there was a circuit split regarding whether an increased risk of identity theft resulting from a data breach is sufficient to establish Article III standing. In McMorris, however, the Second Circuit denied any confusion among its sister courts. Rather, the Second Circuit interestingly held that all courts have technically allowed for the possibility that an increased risk of identity theft could establish standing, but no plaintiff has yet hit the mark. Despite implying that standing could hypothetically exist in certain cases, the Second Circuit nonetheless found that McMorris fell short.

Devonne McMorris was an employee at a veterans’ health services clinic, Carlos Lopez & Associates, LLC (CLA). In 2018, a CLA employee mistakenly sent an email to other CLA employees containing a spreadsheet with sensitive personally identifiable information (PII), including, but not limited to, Social Security numbers, home addresses, and dates of birth of McMorris and over 100 other CLA employees. McMorris and other class-action plaintiffs filed suit claiming that this purported breach caused them to cancel credit cards, purchase credit monitoring and identity theft protection services, and assess whether to apply for new Social Security numbers. The class-action plaintiffs reached a settlement with CLA, but when sent to the district court for approval, the United States District Court for the Southern District of New York rejected the parties’ agreement for lack of Article III standing. Only McMorris appealed to the Second Circuit.

The Holding

After reviewing recent decisions delivered by other circuits regarding standing and an increased risk for identity theft, the Second Circuit denied the existence of a circuit split, stating “[i]n actuality, no court of appeals has explicitly foreclosed plaintiffs from establishing standing based on a risk of future identity theft – even those courts that have declined to find standing on the facts of a particular case.”

In deciding the present case, as a case of first impression, the Second Circuit unequivocally held that an increased risk of identity theft could be enough to establish standing, but only under the right circumstances. The Second Circuit set forth a non-exhaustive list of factors to consider:

  1. Whether the plaintiff’s data has been exposed as the result of a targeted attempt to obtain that data (which would make future harm more likely);
  2. Whether any portion of the dataset has already been misused, even if the plaintiffs themselves have not yet experienced identity theft or fraud; and
  3. Whether the type of data that has been exposed is of such a sensitive nature that the risk of identity theft or fraud is heightened.

Despite the foregoing encouragement to would-be plaintiffs, the Second Circuit then struck a blow, holding that self-created damages, in the form of proactive steps to acquire protection from future harm post-data breach, such as purchasing credit monitoring, do not establish an injury in fact. Because there was no evidence of further dissemination of the PII and McMorris’s data was not exposed as a result of a targeted hacking attempt, thereby making future harm hypothetical, McMorris lacked Article III standing. Although the data was sensitive, the court stated “[t]he sensitive nature of McMorris’s internally disclosed PII, by itself, does not demonstrate that she is at substantial risk of future identity theft or fraud.”

McMorris has large implications for both companies and victims of data breaches because the Second Circuit made sweeping proclamations about the national state of the law of standing for data breach victims. Although the refusal to recognize credit monitoring as indicia of future harm may make it difficult for would-be plaintiffs to prove heightened risk and establish standing, the Second Circuit has nonetheless created a hypothetical roadmap for doing so in an area of the law that has been analogized to the Wild West. Notably, the roadmap enumerated by the court seems to encompass the “risk of harm” analysis used by several states, namely, that if data is accessed or acquired by an unauthorized party, it is still not a data breach if there is no risk of harm to the data subject. With this in mind, companies should review their policies and procedures regarding the prevention of and reaction to data breaches. With appropriate prevention and monitoring tools, the chance of a successful “targeted attempt to obtain data,” which could result in lawsuits, is decreased. Moreover, procedures, such as encryption of sensitive data, lower the likelihood that stolen data has “a high risk for identity theft or fraud.”

Contact Lissette Payne or Lyndsay Medlin with any questions or to discuss the impact of this case. For other updates and alerts regarding data breach liability, subscribe to Bradley’s privacy blog, Online and On Point.

Florida Legislature Considers Sweeping Data-Privacy Legislation Supported by Governor

Florida has joined the wave of states considering new comprehensive data privacy legislation. On February 15, 2021, Rep. Fiona McFarland introduced HB 969, modeled after the California Consumer Privacy Act (CCPA). The bill is supported by Gov. Ron DeSantis and the speaker of the Florida House. As introduced, HB 969 would apply to for-profit businesses that meet any of the following thresholds: annual gross revenues exceeding $25 million; annually buying, selling, or receiving the personal information of at least 50,000 consumers; or deriving at least 50% of their annual global revenues from selling or sharing consumers’ personal information. A Senate version of a similar bill (SB 1734) introduced by Republican Sen. Jennifer Bradley passed through its first committee earlier this week.

Both bills impose a number of requirements on covered entities relating to consumers’ personal information – for example, entities must maintain an online privacy policy and update it annually, provide notice at the point of collection, respond to consumers’ requests for copies of their personal information or to correct such information or delete it under certain circumstances. Covered entities also must provide consumers with the right to opt out of sharing personal information, and they are prohibited from discriminating against those who choose to do so. The bills also go a step further than what is required under CCPA and include additional business obligations, such as data retention and limited use requirements.

The companion bills also provide consumers with numerous rights regarding their collected personal information, including the right to request that a business provide a copy of their personal information collected, the right to have their personal information be deleted by covered entities, and the right to have inaccurate personal data corrected.

Like the CCPA, the Florida bills provide a private cause of action against a business if there is a data breach. Similarly, the private right of action is limited to only certain data breaches. A consumer could sue a business if their nonencrypted and nonredacted personal information was stolen in a data breach as a result of the business’s failure to maintain reasonable security procedures and practices to protect it. If this happens, the consumer can sue for the amount of monetary damages actually suffered from the breach or up to $750 per incident.

For all other violations, only the Florida Department of Legal Affairs can file an action. If the department has reason to believe that any business is in violation and that proceedings would be in the public interest, the department may bring an action against such business and may seek a civil penalty of not more than $2,500 for each unintentional violation or $7,500 for each intentional violation. Such fines may be tripled if the violation involves a consumer who is sixteen years of age or younger. A business may be found to be in violation if it fails to cure any alleged violation within 30 days after being notified in writing by the department of the alleged noncompliance.

In their current form, if passed, both bills have an effective date of January 1, 2022. The legislation has been assigned to the Commerce Committee and the Civil Justice and Property Rights subcommittees. The bill has already received a favorable recommendation from the Regulatory Reform subcommittee. The companion Senate bill is also pending in committee. With the support of the governor and the speaker of the house, there is a strong possibility that some form of legislation will pass. Stay tuned for further updates and alerts from Bradley on state privacy law developments and obligations by subscribing to Bradley’s privacy blog, Online and OnPoint.

Privacy Litigation Updates for the Financial Services Sector: Claims Against Yodlee Survive and Limited Discovery of Envestnet Allowed

In November 2020, Yodlee and its parent company Envestnet filed separate motions to dismiss the class action lawsuit brought over Yodlee’s alleged data collection and use practices. Yodlee’s motion to dismiss argued that plaintiffs failed to state a claim under Federal Rule of Civil Procedure 12(b)(6), while Envestnet argued that its status as the parent company to Yodlee was not enough for the court to establish personal jurisdiction over Envestnet under Federal Rule of Civil Procedure 12(b)(2).

On February 16, 2021, Federal Magistrate Judge Sallie Kim partially granted and partially denied Yodlee’s motion to dismiss and reserved ruling on Envestnet’s motion to dismiss. The court allowed plaintiffs to cure deficiencies and file an amended complaint. On March 15, 2021, plaintiffs filed a Second Amended Complaint.

Yodlee’s Motion to Dismiss

Claims 1 and 10 – Invasion of Privacy:

The court held that plaintiffs have a reasonable expectation of privacy in their individual financial accounts. Yodlee is alleged to have improperly accessed and retained data from these personal accounts. Furthermore, Yodlee is alleged to have sold aggregated financial data that “would only take a few steps to identify the individual.”

The court denied Yodlee’s motion to dismiss Claims 1 and 10.

Claim 2 – Stored Communications Act:

The court held that plaintiffs failed to allege facts sufficient to satisfy the element of “electronic storage” because plaintiffs only alleged Yodlee “stores the information for its own misuse of the data.”

The court granted Yodlee’s motion to dismiss Claim 2 with leave to amend.

Claim 3 – Unjust Enrichment:

The court held that plaintiffs’ allegations that Yodlee acquired their data through a fraudulent scheme and sold that data were pled with enough particularity to put Yodlee on notice of the substance of the alleged fraudulent scheme.

The court denied Yodlee’s motion to dismiss Claim 3.

Claim 4 – California Civil Code § 1709:

The court found that plaintiffs sufficiently alleged Yodlee’s alleged fraudulent scheme to deceive plaintiffs.

The court denied Yodlee’s motion to dismiss Claim 4.

Claim 5 – California Unfair Competition Law – Business and Professional Code § 17200:

The court held that plaintiffs did not allege “a transaction or contract with Yodlee,” only the “Loss of Benefit of the Bargain,” and as such, it is unclear how plaintiffs “lost money or property as a result of Yodlee’s alleged conduct.” Furthermore, although plaintiffs allege the inability to seek indemnification and the heightened risk of identity theft, the court held that since neither of these have occurred yet, they are merely potential and hypothetical and not enough to have standing to bring suit over this cause of action.

The court granted Yodlee’s motion to dismiss Claim 5 with leave to amend.

Claims 7 and 9 – Computer Fraud and Abuse Act and California Comprehensive Data Access and Fraud Act:

The court held that plaintiffs’ damage claims of “the costs of conducting damage assessments, restoring the data to its condition prior to the offense, and consequential damages they incurred by, inter alia, spending time conducting research to ensure that their identity had not been compromised and accounts reflect the proper balances” were conclusory and insufficient to show damage or loss.

The court granted Yodlee’s motion to dismiss Claims 7 and 9 with leave to amend.

Claim 8 – California Anti-Phishing Act of 2005:

The court held that plaintiffs stated a claim under the California Anti-Phishing Act by alleging that Yodlee represented itself to be plaintiffs’ financial institutions, an allegedly fraudulent and deceitful impersonation of those institutions, and thereby induced plaintiffs to provide their login credentials to defendants.

The court denied Yodlee’s motion to dismiss Claim 8.

Envestnet’s Motion to Dismiss for Lack of Personal Jurisdiction

The court held that plaintiffs have not alleged sufficient facts to bring an alter ego claim against Envestnet, noting that alter ego is a rare remedy. To invoke it, a plaintiff must show (1) a unity of interest between the entities and (2) that an inequitable result will occur if the claim is not recognized. To show unity of interest, plaintiffs should plead facts supporting at least two or three of the following factors: “commingling of funds, identification of the equitable owners with domination and control of the two entities, instrumentality or conduit for a single venture or the business of an individual, failure to maintain minutes or adequate corporate records, use of the same office or business locations, identical equitable ownership of the two entities, use of a corporation as a mere shell, and the failure to adequately capitalize a corporation.” Furthermore, in some jurisdictions, including the present one, a showing of bad faith is required.

The court noted that, as it stands, plaintiffs have not alleged sufficient facts to support their alter ego claim. However, the court reserved ruling on Envestnet’s motion to dismiss until plaintiffs have an opportunity to conduct discovery on the issue. The court provided plaintiffs the opportunity to issue five document requests, five interrogatories, and five requests for admissions, as well as take one deposition of Envestnet. Plaintiffs must then file a supplemental brief no later than May 28, 2021, and Envestnet may file a response by June 11, 2021.


Many of plaintiffs’ claims have survived the motion to dismiss, bringing to light the legal and reputational risks from these data-sharing practices. Considering this pending case, businesses should review their privacy policies and procedures to ensure their data privacy compliance programs are up to date, accurately disclose their sharing practices, and protect consumer data. Based on this order, there are two significant areas to watch: anonymized, aggregated data and application programming interface (API) interactions.

Anonymized, Aggregated Data

The court found that plaintiffs have a reasonable expectation of privacy in their personal, financial accounts at an individual level. Though Yodlee argued that plaintiffs do not have a reasonable expectation of privacy in anonymized, aggregated data, the court pointed to plaintiffs’ allegation that it “would only take a few steps to identify the individual Plaintiffs from the transactions.”

All businesses should review their contracts with third-party service providers, including those that provide APIs, to ensure that contractual language defining anonymized, aggregated data complies with relevant privacy laws and provides required protections, as well as defines whether and to what extent the business grants the third party permission to use and further disclose such anonymized, aggregated data.

API Interactions

Many of plaintiffs’ claims were based on absent or unclear disclosure of Yodlee’s interactions with their financial institutions. While plaintiffs allege that Yodlee does not have authority or approval from each financial institution, the use of a login screen that appears to be the financial institution is likely part of the API software agreement that the financial institutions pay to use. Businesses should ensure that any interaction with third-party processors on their websites or applications clearly and explicitly states the role of the third party and that such role is properly reflected in the businesses’ privacy policies.

If you have any questions or would like to discuss your company’s data sharing practices, contact Courtney Achee, Lissette Payne or Kelley Hails. For more information on this developing case and other updates and alerts regarding privacy law developments, subscribe to Bradley’s privacy blog Online and On Point.