Here's what everyone tells you about CMMC Level 1: "It's just 17 requirements. Self-attested. No big deal."

Here's what actually happens when you sit down to do it: you realize "17 requirements" means 17 practices, each of which needs a policy, a procedure, evidence you're doing what the policy says, and a written attestation you're willing to sign. Some practices are one-liners. Others require you to document entire categories of behavior your company has never formally written down.

Then you have to submit a score to SPRS. Your prime pulls it from PIEE. If your documentation is vague, your score gets challenged — and you either scramble to fix it or lose the contract.

This post is the thing we wish every small contractor had before they started. It's not a replacement for a proper readiness review. But it will tell you what "passing-grade" documentation looks like, where people typically fail, and how to avoid having your self-attestation rejected.

First — the number everyone gets wrong

CMMC Level 1 requires 17 practices, not 15. The confusion comes from FAR 52.204-21, which lists 15 "basic safeguarding requirements." One of those 15, the physical-security requirement covering visitor escorts, physical access logs, and physical access devices, splits into three separate practices under the NIST SP 800-171 numbering CMMC uses, which is where the extra two come from. If anyone in your organization is operating on the "15 practice" number, correct it before you submit anything.

The 17 practices break into six families:

  1. Access Control (AC): 4 practices
  2. Identification and Authentication (IA): 2 practices
  3. Media Protection (MP): 1 practice
  4. Physical Protection (PE): 4 practices
  5. System and Communications Protection (SC): 2 practices
  6. System and Information Integrity (SI): 4 practices

Each practice needs documentation that answers three questions about your business:

  1. What is the policy? (The rule.)
  2. What is the procedure? (How you carry out the rule.)
  3. What is the evidence? (Proof you actually do it.)

If your documentation only answers question 1, it's a policy — not a Level 1 attestation. This is the single most common mistake we see.
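
One way to keep yourself honest is to treat each practice as a record where all three answers are required fields. A minimal sketch of that structure in Python (the class and field names are ours for illustration, not anything official):

```python
# A sketch: one record per practice, all three answers mandatory.
from dataclasses import dataclass

@dataclass
class PracticeDoc:
    practice_id: str  # e.g., "AC.L1-3.1.1"
    policy: str       # the rule
    procedure: str    # how you carry out the rule
    evidence: str     # named proof you actually do it

    def is_attestable(self) -> bool:
        """A policy alone is not an attestation: all three must be filled in."""
        return all(field.strip() for field in (self.policy, self.procedure, self.evidence))
```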

What "passing-grade" documentation actually looks like

Let's take a real Level 1 practice as an example: AC.L1-3.1.1 — Limit system access to authorized users, processes acting on behalf of authorized users, and devices.

The wrong way to document this (most first drafts):

"Only authorized personnel have access to our systems. We use strong passwords. Access is granted based on job role."

That's three sentences of policy. No procedure, no evidence, no specificity. A reviewer or a prime can't do anything with this.

The passing-grade version:

Policy: "System access is granted only to personnel whose roles require it. Access is provisioned through written approval by [the CTO / IT Manager / designated role] and tracked in [your ticketing system / access log]. Access is reviewed quarterly and revoked within one business day of termination or role change."

Procedure: "New user access requests are submitted via [form / email / ticket] to [role]. Approval requires documented business justification. Provisioning is performed by [role] using [system — e.g., Okta, Microsoft 365 admin, Google Workspace admin]. All provisioning events are logged. Quarterly, [role] reviews active accounts against current personnel and removes inactive or misaligned accounts."

Evidence: "Access request tickets for the last 90 days. Quarterly access review records. Ticket numbers and timestamps showing account removal within one business day of termination events."

See the difference? The passing-grade version names roles, systems, and timeframes the reviewer can verify. The failing version is just rhetoric.
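
Documentation this specific also tends to be automatable. As a rough sketch, here is what the quarterly review step could look like if you export active accounts from your identity provider and a current roster from HR as CSV files (the file and column names are illustrative, not something your tools produce by default):

```python
# A sketch: flag active accounts that no longer match the HR roster.
import csv

def load_emails(path: str, column: str) -> set[str]:
    """Read one column from a CSV export into a lowercase set."""
    with open(path, newline="") as f:
        return {(row.get(column) or "").strip().lower() for row in csv.DictReader(f)}

def main() -> None:
    # Hypothetical exports: pull these from your identity provider's admin
    # console (Okta, Microsoft 365, Google Workspace) and from HR.
    active = load_emails("active_accounts.csv", "email")
    roster = load_emails("hr_roster.csv", "email")

    orphaned = sorted(active - roster)  # accounts with no current employee
    if orphaned:
        print(f"{len(orphaned)} account(s) need review or removal:")
        for email in orphaned:
            print(f"  {email}")
    else:
        print("All active accounts match the current roster.")

if __name__ == "__main__":
    main()
```

The script's output, saved with a date, becomes part of your quarterly access review record: evidence, not just policy.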

The four mistakes that get SPRS submissions rejected

1. Vague language masquerading as policy

Phrases like "industry best practices," "appropriate controls," "reasonable measures," and "as needed" are reviewer red flags. They suggest you don't actually have a defined process. Replace every instance of these with a specific role, frequency, or system.

Bad: "Accounts are reviewed regularly." Good: "Accounts are reviewed quarterly by the IT Manager."

2. Copy-paste policies that don't match your environment

Small contractors often buy a policy template pack, find-and-replace "[Company Name]," and submit it. A reviewer can spot this in 30 seconds — the policies reference systems you don't use, roles you don't have, or procedures impossible in your environment.

If you use a template, edit it ruthlessly to match reality. If your template says "the Chief Information Security Officer performs a weekly review" and you don't have a CISO, change it to whoever actually does the review (and if no one does, that's the problem to fix — not the policy to lie about).

3. Aspirational documentation

Writing policies that describe what you will do instead of what you currently do is a rejection waiting to happen. A self-attestation is a statement about the present state of your environment, not a roadmap.

Bad: "We will implement multi-factor authentication on all cloud services by end of year." Good: "Multi-factor authentication is currently enforced on all cloud services. Evidence: Okta/M365 tenant configuration screenshots dated [Q1 2026]."

If you can't make the "good" version true yet — don't attest yet. Remediate first, then attest.
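
One habit that makes the "good" version easy to defend: whenever you capture configuration screenshots or exports as evidence, record the capture date and a file hash so you can show later that the artifact hasn't changed. A small sketch, assuming your artifacts live in a folder called evidence/ (the paths and file names are ours):

```python
# A sketch: stamp evidence files into a manifest with date and SHA-256.
import csv
import hashlib
from datetime import datetime, timezone
from pathlib import Path

EVIDENCE_DIR = Path("evidence")           # hypothetical folder of artifacts
MANIFEST = Path("evidence_manifest.csv")

def sha256(path: Path) -> str:
    """Hash a file in chunks so large exports don't blow up memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def main() -> None:
    with MANIFEST.open("w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["file", "sha256", "recorded_utc"])
        for path in sorted(EVIDENCE_DIR.rglob("*")):
            if path.is_file():
                writer.writerow([str(path), sha256(path),
                                 datetime.now(timezone.utc).isoformat()])
    print(f"Wrote manifest for files under {EVIDENCE_DIR}/ to {MANIFEST}")

if __name__ == "__main__":
    main()
```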

4. No evidence referenced at all

Some contractors submit a documentation bundle that describes every practice beautifully — and includes zero artifacts. The reviewer has no way to verify anything. Every practice should have a named evidence source: a screenshot, a log export, a ticket range, a reviewed document, a meeting minute.

You don't have to attach the evidence to your SPRS submission, but you need it on file and referenced by name in your documentation. When your prime asks to see it, you have one business day — not one week — to produce it.

What to actually build before you submit

Based on working with small contractors through this, here's the minimum viable Level 1 documentation package:

  1. A short System Security Plan (SSP) — 5–10 pages describing your environment, your boundary (what's in scope), and an inventory of systems that touch FCI (Federal Contract Information).
  2. A policy document covering all 17 practices — one page per practice, each answering policy / procedure / evidence. About 20–30 pages total.
  3. An evidence index — a spreadsheet listing every piece of evidence you're relying on, where it's stored, and who owns it. (A completeness check like the sketch after this list catches gaps here.)
  4. Your SPRS scoring worksheet — your actual self-scored numbers with math you can defend.
  5. A "before submit" checklist — a sanity pass that catches the four mistakes above.

Most small contractors take 4–8 weeks to build this from scratch, including the back-and-forth with their IT person to confirm what's actually in place.

If you want this done faster

Our free 5-minute scoping tool will confirm you actually need Level 1 (not Level 2, not nothing). After that, there are two paths forward that work.

Our Level 1 Template Pack ($999) is a productized version of the documentation we deliver on our consulting engagements. It includes the five documents described above plus a boundary diagram template: the SSP, the 17 practice-level policy documents, the evidence index workbook, the SPRS scoring worksheet, and the pre-submit review checklist. Every policy has passing-grade example language written the way reviewers want to see it, with specific role names and frequencies already filled in for small-contractor environments. You edit it to match your actual stack.

The Pro tier ($1,999) adds an AI-assisted pre-submit review — upload your edited documents and get specific feedback on vague language, missing evidence, and rejection risk before you submit to SPRS — plus a 30-minute consultation call with one of our practitioners. For most small contractors, this is the version that matters. It exists specifically to eliminate doubt before you sign a federal attestation.

See the Level 1 Template Pack →

Either way: don't submit before someone who's done this before looks at your draft. SPRS rejections are fixable, but they delay contracts, and delayed contracts cost orders of magnitude more than a couple of hours of review.


vCISOx helps small defense contractors scope, prepare for, and pass CMMC assessments. If you're earlier in the process, start with the free scoping tool — it'll tell you honestly whether you need Level 1, Level 2, or neither.