CMMC Controls List: How Many Controls by Level?
CMMC / March 24, 2026

The CMMC controls list varies by level. Level 1 includes 17 practices focused on safeguarding federal contract information. Level 2 includes 110 requirements aligned to NIST SP 800-171. Control count alone does not determine effort: scope, ownership, and evidence discipline drive readiness. Understanding how many controls are in CMMC Level 1 and Level 2 helps with staffing, tool coverage, and assessment planning.
Introduction
CMMC requirements under CMMC 2.0 can influence how a contractor designs systems, documents workflows, and prepares for acquisition reviews. The updated CMMC 2.0 framework clarifies expectations around safeguarding federal contract information and Controlled Unclassified Information within defined system boundaries. A strong technology stack still needs clear scope and consistent proof, since assessments focus on what is implemented within a defined boundary and how it is operated over time.
This page provides a plain-language view of the CMMC controls list, the control counts used in the current model, and a practical way to organize owners and evidence. It is written for security, IT, compliance, and program leaders supporting Department of Defense work where contract data is handled in contractor environments. It also serves as a reference for planning discussions that involve staffing, tooling, and schedule tradeoffs, and it supports consistent decisions on scope and proof.
CyberCrest offers readiness planning, scope definition, documentation alignment, and assessment preparation. The goal is a maintainable program that matches contract expectations, limits rework, and reduces disruption during assessment activity.
Controls, Practices, and Requirements
Teams often say “controls” when they mean a mix of technical settings, procedures, and governance activities. In the Cybersecurity Maturity Model Certification program, the work is defined as specific practices and requirements that must be met within a documented boundary.
A CMMC controls list is most useful when it is treated as a working inventory of required CMMC controls: what must be implemented, who owns it, and what evidence proves it is operating. In CMMC 2.0, documented ownership and traceable proof are central to demonstrating alignment with required controls. This approach keeps checklist work tied to operations instead of a one-time document build.
Key concepts to align early:
- Assessment boundary (CMMC assessment scope): the information systems and supporting assets that process, store, or transmit Federal Contract Information (FCI) or Controlled Unclassified Information (CUI) for the contract workflow, as defined by CMMC scoping requirements.
- Control owner: the person accountable for the routine that keeps a safeguard operating.
- Evidence: objective proof that the requirement is met in scope, dated and traceable to the boundary.
Control Counts and Structure in CMMC 2.0
CMMC 2.0 is a three-level model designed to scale safeguards to the sensitivity of data in scope. The CMMC 2.0 structure emphasizes alignment with NIST SP 800-171 while clarifying expectations for assessment and affirmation. Many cost and schedule issues come from confusion about what “control count” means and how it maps to day-to-day work.
A useful way to read the model is to separate counts from implementation:
- Counts describe how many defined practices or requirements apply to a level.
- Implementation describes what must be in place, documented, and operating in the boundary.
This section also answers two common planning questions: how many controls are in CMMC Level 1 and how many are in CMMC Level 2.
CMMC Controls by Level
At a planning level, CMMC controls by level can be summarized as:
- Level 1: 17 practices aligned to basic safeguarding under the Federal Acquisition Regulation, focused on federal contract information. Level 1 is typically met through annual self-assessment with an annual affirmation by a senior official [2].
- Level 2: 110 requirements aligned to NIST SP 800-171 Rev. 2 for CUI, supported by deeper documentation and operational proof [4].
- Level 3: Level 2 plus DoD-selected enhanced security requirements from NIST SP 800-172 (February 2021), assessed by the U.S. Government through DCMA DIBCAC under 32 CFR Part 170.
Control counts help shape staffing, tool coverage, and evidence workload. They do not replace scoping. A controls list becomes accurate only when it reflects the CMMC assessment scope and the real systems used for delivery.
Read also: Understanding CMMC 2.0 Levels: A Guide for Defense Contractors
Why Control Count Does Not Equal Effort
Two organizations can target the same level and face different workloads. Effort depends on scope size, inherited services, and how consistently controls are operated.
Common factors that increase effort:
- A boundary that includes shared systems not dedicated to contract work.
- Identity and endpoint tools that are not configured consistently across the boundary.
- Evidence that exists in tickets and logs but is not organized for review.
- Decentralized ownership where no one is accountable for cadence and sign-off.
- Programs that depend on outside vendors without a clear interface for requests and proof.
A controls list should include the boundary objects that matter, such as systems, applications, accounts, and support paths. When that list is linked to owners and evidence, planning becomes more predictable.
Before the Controls List: Scope and Evidence Foundations
Controls lists fail when teams start remediation before the boundary is stable. A stable boundary supports cleaner implementation, stronger evidence, and less disruption during third-party assessments.
Use this sequence:
- Map where covered data enters the organization and where it flows.
- Define an enclave or boundary that limits systems in scope while supporting delivery.
- Inventory endpoints, servers, applications, identities, and privileged roles in scope.
- Document dependencies across internal networks, cloud services, operational technology, and external service providers.
- Build an evidence index that links each requirement to specific artifacts and owners.
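The evidence index in the last step can be sketched as a small structure that links each requirement to its artifacts and an accountable owner. The practice IDs, file paths, and role names below are illustrative assumptions; a spreadsheet or GRC tool serves the same purpose.

```python
from dataclasses import dataclass

@dataclass
class EvidenceEntry:
    requirement: str  # e.g. "AC.L2-3.1.1" (illustrative practice ID)
    artifacts: list   # evidence locations: file paths, ticket IDs, report links
    owner: str        # accountable control owner

def missing_evidence(index):
    """Return requirement IDs that have no linked evidence artifacts yet."""
    return [entry.requirement for entry in index if not entry.artifacts]

# Illustrative entries: one requirement has evidence, one still needs it
index = [
    EvidenceEntry("AC.L2-3.1.1", ["idp/access-review-2026-Q1.csv"], "IAM Lead"),
    EvidenceEntry("AU.L2-3.3.1", [], "SecOps Lead"),
]
print(missing_evidence(index))  # → ['AU.L2-3.3.1']
```

A gap report like this can drive the weekly readiness standup: every ID it returns is a requirement that is claimed in scope but not yet provable.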
Foundational work products to maintain:
- A documented system security plan describing the assessment boundary and supporting architecture.
- Data flow summary for contract workflows.
- Inventory extracts for assets, users, and privileged roles.
- Repository structure and naming standards.
- Plan for required records in the Supplier Performance Risk System.
When DFARS 252.204-7021 is included in a solicitation or contract, the required CMMC level becomes an explicit condition of award and performance, so the controls list, owners, and evidence plan should be tied to that clause language [3].
Department of Defense (DoD) estimates and program updates can shift over time. Contract clauses and internal milestones should drive the project plan.
Level 1 Controls: Practical Coverage Areas
Level 1 is designed for baseline safeguarding when contract data is present in covered contractor information systems. The objective is consistent basic cyber hygiene and evidence that safeguards operate as part of daily work.
Level 1 work is easier to manage when it is grouped into operational areas instead of tracked as isolated items.
Identity and Account Management
Coverage goals:
- Issue accounts through documented approval steps.
- Limit privileged access and review it on a cadence.
- Remove access during offboarding in a defined timeframe.
Evidence to retain:
- Account lifecycle procedures with an assigned owner.
- Samples of creation and removal records.
- Review notes for privileged access.
Endpoint Baselines and Patch Execution
Coverage goals:
- Keep managed devices current on updates.
- Maintain endpoint protection settings and alert workflows.
- Track exceptions and corrective actions.
Evidence to retain:
- Patch status reports for in-scope endpoints.
- Endpoint protection configuration exports and alert handling tickets.
- Malicious code protection policy records for devices in scope.
Facility and Physical Safeguards
Coverage goals:
- Restrict facility access to sensitive areas.
- Maintain visitor controls and access logs.
- Secure backup media and portable devices.
Evidence to retain:
- Visitor logs or access badge reports for restricted locations.
- Locked storage checklists or photos tied to dates and owners.
- Maintenance access records for sensitive areas.
Media and Data Handling
Coverage goals:
- Control removable media use.
- Define approved transfer methods for contract deliverables.
- Record disposal actions for retired media.
Evidence to retain:
- Media handling rules and exceptions.
- Disposal or destruction confirmations.
- Approved transfer method documentation.
Level 2 Controls: Family-Based Organization
Level 2 aligns to the 110 security requirements in NIST SP 800-171 Rev. 2 and is designed to protect Controlled Unclassified Information handled by contractors. A controls list is easier to run when it is organized into requirement families with clear owners and proof expectations.
Below is a practical structure using common family labels and abbreviations. It supports planning, evidence mapping, and ongoing operation.
Access Control (AC)
Focus: limit system access to authorized users, processes, and devices, and to the transactions and functions those users are permitted to perform for contract delivery.
Evidence themes:
- Role design tied to job responsibilities and documented access control policies.
- Approval workflow for privileged access.
- Periodic access reviews with documented outcomes.
Awareness and Training (AT)
Focus: ensure personnel understand policies, reporting, and acceptable use.
Evidence themes:
- Training content and completion records.
- Role-based training for administrators.
- Policy acknowledgment records.
Audit and Accountability (AU)
Focus: record events that support investigation and accountability.
Evidence themes:
- Logging configuration for key systems.
- Protected storage of audit logs and retention settings.
- Dated review records and follow-up actions.
Configuration Management (CM)
Focus: define baselines and control changes to key systems.
Evidence themes:
- Approved baseline configurations.
- Change records with approvals and validation outputs.
- Exception register with review cadence.
Identification and Authentication (IA)
Focus: confirm identity and authenticate users before granting access.
Evidence themes:
- Strong authentication for remote access and privileged actions.
- Account management records.
- Identity provider configuration exports.
Incident Response (IR)
Focus: prepare to detect, respond, and recover from incidents.
Evidence themes:
- Incident procedures and escalation paths.
- Exercise records and corrective actions.
- Incident reporting workflow records.
Maintenance (MA)
Focus: control maintenance tools and access paths.
Evidence themes:
- Time-bound support access records.
- Vendor support approvals and review notes.
- Tool use records tied to service activity.
Media Protection (MP)
Focus: control media use, transport, storage, and disposal.
Evidence themes:
- Handling rules and exception tracking.
- Storage controls for backups and portable media.
- Disposal evidence.
Personnel Security (PS)
Focus: manage access during onboarding, role changes, and offboarding.
Evidence themes:
- Onboarding checklists tied to access grant approvals.
- Offboarding records with access removal proof.
- Role change approvals.
Physical Protection (PE)
Focus: enforce physical protection measures and limit physical access to systems, devices, and media within the defined assessment boundary.
Evidence themes:
- Facility access lists and visitor records.
- Secure storage records for sensitive assets.
- Review records for facility access.
Risk Assessment (RA)
Focus: identify threats and prioritize mitigations.
Evidence themes:
- Risk assessment method and results.
- Risk acceptance records with owners and review cadence.
- Mitigation tracking with closure proof.
Security Assessment (CA)
Focus: test safeguards and confirm they are effective.
Evidence themes:
- Internal review schedule and results.
- Corrective action tracking and closure evidence.
- Mapping of assessment objectives to artifacts.
System and Communications Protection (SC)
Focus: protect data flows and network communications.
Evidence themes:
- Boundary diagrams and segmentation decisions.
- Approved encryption and transfer methods.
- Remote access configuration exports.
System and Information Integrity (SI)
Focus: vulnerability management, integrity monitoring, and deployment of malicious code protection mechanisms across in-scope systems.
Evidence themes:
- Vulnerability scan results and remediation tracking.
- Patch compliance reports for key systems.
- Monitoring and alert handling records that show integrity monitoring.
Turning the Controls List into Tickets, Owners, and Cadence
A list becomes operational when each item has an owner, a cadence, and a definition of proof. Many teams maintain a controls list as a spreadsheet or GRC register with consistent fields.
Suggested fields to track:
- Control or requirement reference.
- In-scope systems and applications.
- Control owner and backup owner.
- Procedure reference and evidence location.
- Evidence type and collection cadence.
- Review cadence and sign-off method.
- Open gap status and closure criteria.
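The fields above can be captured in a simple record type. The field names and status values in this sketch are assumptions, not a prescribed schema; a spreadsheet column set or GRC form works just as well.

```python
from dataclasses import dataclass

@dataclass
class ControlRecord:
    reference: str                # control or requirement reference
    systems: list                 # in-scope systems and applications
    owner: str                    # control owner
    backup_owner: str
    procedure_ref: str            # procedure reference
    evidence_location: str        # where proof is stored
    evidence_type: str            # e.g. export, report, ticket
    collection_cadence_days: int  # how often evidence is collected
    review_cadence_days: int      # how often reviews are signed off
    signoff_method: str
    gap_status: str               # assumed values: "open" or "closed"

def open_gaps(register):
    """List references that still have an open gap."""
    return [r.reference for r in register if r.gap_status == "open"]

register = [
    ControlRecord("AC.L2-3.1.1", ["ERP", "VPN"], "IAM Lead", "IT Manager",
                  "PROC-AC-01", "grc://evidence/ac", "access review export",
                  90, 90, "ticket sign-off", "open"),
]
print(open_gaps(register))  # → ['AC.L2-3.1.1']
```

Keeping every row complete forces the conversation the register exists for: if a field cannot be filled in, the control has no owner, no proof, or no cadence yet.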
This structure keeps organizational communications clear. Control owners know what they must run, what proof they must store, and how proof will be reviewed.
“A controls register works when it assigns ownership, defines evidence, and sets a cadence that teams can keep during delivery.” (CyberCrest Compliance Team)
Evidence That Stays Useful
Evidence tends to decay when it is collected only during a readiness sprint. A usable evidence set is collected during normal operations and stored with dates and scope context.
Strong evidence traits:
- It is tied to the boundary and includes system identifiers.
- It shows operation over time, not a single configuration snapshot.
- It can be reproduced by the system owner.
- It links back to a documented procedure.
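One way to keep artifacts tied to the boundary is a file naming standard that encodes the requirement, collection date, and system identifier. The pattern below is a hypothetical convention for illustration, not a CMMC requirement.

```python
import re

# Hypothetical convention: <practice-ID>_<YYYY-MM-DD>_<system>.<ext>
# e.g. "AC.L2-3.1.1_2026-01-10_erp-prod.csv"
PATTERN = re.compile(r"^[A-Z]{2}\.L[12]-\d+\.\d+\.\d+_\d{4}-\d{2}-\d{2}_[\w-]+\.\w+$")

def nonconforming(filenames):
    """Flag artifacts whose names omit the requirement, date, or system."""
    return [name for name in filenames if not PATTERN.match(name)]

files = ["AC.L2-3.1.1_2026-01-10_erp-prod.csv", "screenshot1.png"]
print(nonconforming(files))  # → ['screenshot1.png']
```

A check like this, run against the evidence repository on a cadence, catches undated screenshots and unlabeled exports before an assessor does.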
Common evidence artifacts:
- Identity exports, access review records, and approval tickets.
- Patch and vulnerability reports with remediation closure proof.
- Change records tied to baseline changes.
- Logging configuration exports and review notes.
- Backup test records and corrective actions.
Sustainment After the First Review
A controls list is not static. Staff changes, tool changes, and new contract workflows can create drift. Sustainment keeps the boundary accurate and keeps proof current.
Sustainment actions to schedule:
- Periodic access reviews with documented outcomes.
- Patch and vulnerability routines with closure proof.
- Log review cadence with retained review notes.
- Backup tests with recorded results and follow-up actions.
- Internal reviews that confirm procedures match real settings.
A sustainment plan also clarifies what must be updated when scope changes. This reduces late remediation work and supports faster reassessment cycles.
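Much of a sustainment routine reduces to a freshness check over the controls register: compare the last collection date for each item against its cadence and flag what is overdue. The register shape and cadence values below are illustrative.

```python
from datetime import date, timedelta

def overdue_items(register, today):
    """Return entries whose newest evidence is older than its cadence allows."""
    overdue = []
    for ref, row in register.items():
        age = today - row["last_collected"]
        if age > timedelta(days=row["cadence_days"]):
            overdue.append(ref)
    return overdue

# Illustrative rows: reference -> last evidence date and collection cadence
register = {
    "access-review": {"last_collected": date(2025, 9, 1), "cadence_days": 90},
    "backup-test": {"last_collected": date(2025, 12, 15), "cadence_days": 30},
}
print(overdue_items(register, date(2026, 1, 10)))  # → ['access-review']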
Using the Controls List During an Assessment
A controls list supports assessment execution when it is paired with an evidence index and a clear request workflow. During interviews and technical testing, assessors often sample a subset of items and ask for proof that spans multiple dates.
Assessment readiness steps to include in the plan:
- Confirm the boundary and inventory exports match what will be presented during review.
- Assign a single owner to manage evidence requests and track responses.
- Pre-stage exports, reports, and tickets that show operation across time.
- Define rules for redaction and sharing of sensitive artifacts.
- Record follow-up actions and link closure proof back to the controls register.
This structure reduces time spent searching for artifacts and keeps review activity aligned to the documented boundary.
Read also: CMMC Audit Guide: Compliance Roadmap
Common Errors to Avoid
Controls lists often break down due to scope drift, weak evidence, and mismatched documentation.
Frequent issues and fixes:
- Scope grows after remediation begins. Fix: freeze scope for defined windows and revalidate inventories on a cadence.
- Controls exist in tools but not in routines. Fix: define cadence, owner accountability, and review records.
- Proof relies on screenshots without context. Fix: use exports, reports, and tickets with dates and identifiers.
- Vendor access is uncontrolled. Fix: document support access paths, enforce time limits, and review records routinely.
- Written descriptions conflict with settings. Fix: update procedures and system descriptions after changes and revalidate during readiness reviews.
Where CyberCrest Fits
CyberCrest offers a structured approach that aligns scope, controls, and proof. Support can include boundary definition, gap validation, evidence mapping, and preparation for the certification process when required.
This approach helps organizations maintain CMMC compliance, reduce assessment disruption, and protect covered contract data during contract performance while supporting national security expectations.
Conclusion
A controls list is useful when it is tied to a stable boundary, clear owners, and dated evidence that shows safeguards operate in scope. Level 1 focuses on baseline safeguarding for contract data. Level 2 aligns to the 110 requirements in NIST SP 800-171 Rev. 2 and is easier to manage by families, owners, and routines. Level 3 builds on Level 2 by adding DoD-selected enhanced security requirements from NIST SP 800-172 (February 2021), assessed by the U.S. Government through DCMA DIBCAC under 32 CFR Part 170.
CyberCrest can help contractors translate requirements into scoped work, defensible proof, and repeatable operations.
The result is a program that supports contract timelines and reduces disruption during assessment activity. It also supports clearer decisions during bids and renewals.
CyberCrest offers services for contractors that need a controls plan that can be executed and maintained. Engagements can start with scope definition and a readiness review, then move into evidence mapping, documentation alignment, and remediation planning tied to the target level.
Schedule a consultation to review target solicitations, confirm the boundary, and build a controls plan with owners and proof standards. The outcome is a clearer roadmap, fewer late surprises, and a readiness posture that can be sustained during contract performance. Teams can also request a guided review of evidence quality before an assessment window. Deliverables can include an updated controls register, an evidence index, and a prioritized remediation plan.
{{cta}}
References
1. 32 CFR Part 170, Section 170.14, “CMMC Model” (GovInfo, current CFR text) https://www.govinfo.gov/link/cfr/32/170?link-type=pdf&sectionnum=14&year=mostrecent
2. 32 CFR Part 170, Section 170.15, “CMMC Level 1 self-assessment and affirmation requirements” (GovInfo, current CFR text) https://www.govinfo.gov/link/cfr/32/170?link-type=pdf&sectionnum=15&year=mostrecent
3. 48 CFR DFARS 252.204-7021, “Cybersecurity Maturity Model Certification Requirements” (Acquisition.gov, current clause text/PDF) https://www.acquisition.gov/sites/default/files/current/dfars/252.204-7021.pdf
4. NIST Special Publication 800-171 Revision 2, “Protecting Controlled Unclassified Information in Nonfederal Systems and Organizations” (Official NIST PDF) https://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-171r2.pdf


FAQ
How Many Controls Are in Level 1 and Level 2?
Organizations often ask how many controls are in CMMC Level 1 and how many are in CMMC Level 2 when planning staffing and timelines. Level 1 is commonly described as 17 practices. Level 2 is commonly described as 110 requirements aligned to NIST SP 800-171 Rev. 2.
Is a Controls List the Same as Passing an Assessment?
A list is a planning tool. Assessments validate what is implemented in scope and what proof shows operation over time. Ownership, cadence, and evidence quality matter as much as the list.
What Should Be Included in the Scope?
Scope should include the systems, users, and services that store, process, or transmit covered data for the contract workflow. A stable boundary and current inventories reduce rework.
Can One Controls List Support Multiple Contracts?
A controls register can support multiple bids when the same environment supports delivery. Contract-specific scope decisions still need review during each cycle.











