A CMMC Level 2 assessment is not a control exam. It's an evidence exam. You can have every NIST 800-171 control implemented correctly and still not pass, because the C3PAO can only score what you can prove.
These are the five failure modes we see most often.
1. Evidence that describes, not demonstrates
The most common mistake. A policy says "we review access quarterly." That's a description. The evidence the assessor needs is a record of an actual quarterly review — a ticket, a signed report, an email thread — from within the assessment period.
If your evidence starts with the word "should," it's a policy. If it starts with a date, a name, or a screenshot, it's evidence.
2. Screenshots without context
A screenshot of a configuration setting is not evidence on its own. It needs three things:
- System identification — which host, which tenant, which account
- Timestamp — when the screenshot was taken
- Traceability — who took it and against which control
Screenshots captured during testing with no metadata are nearly useless in an assessment. Build the capture habit into your evidence workflow, not the night before.
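One way to build that habit is to wrap every capture in a small script that records the metadata alongside the file. This is a minimal sketch, not a prescribed tool: the manifest format, field names, and the example control ID are all illustrative assumptions.

```python
import hashlib
import json
import socket
from datetime import datetime, timezone
from pathlib import Path

def record_evidence(image_path: str, control_id: str, operator: str) -> dict:
    """Append a metadata entry for a captured screenshot to a JSONL manifest."""
    data = Path(image_path).read_bytes()
    entry = {
        "file": image_path,
        "sha256": hashlib.sha256(data).hexdigest(),   # ties the entry to this exact file
        "system": socket.gethostname(),               # which host the capture came from
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "control": control_id,                        # e.g. "AC.L2-3.1.1" (illustrative)
        "operator": operator,                         # who took the screenshot
    }
    with Path("evidence_manifest.jsonl").open("a") as f:
        f.write(json.dumps(entry) + "\n")
    return entry
```

Whatever form it takes, the point is that system identification, timestamp, and traceability get written at capture time, not reconstructed later.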
3. Evidence outside the boundary
We've seen teams spend weeks producing beautiful evidence for systems that aren't in scope, while the actual CUI-handling systems have thin documentation. The assessor scopes against your System Security Plan boundary, not your whole company.
Before you produce a single piece of evidence, redraw your boundary diagram and list every system inside it. Evidence effort goes to those systems. Period.
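That system list can start as a first-pass filter over your asset inventory. The inventory structure and flag name below are hypothetical, and real SSP scoping covers more than CUI-handling assets (security protection assets, for example), but even this crude version forces the boundary question per system:

```python
# Hypothetical asset inventory; "handles_cui" is an illustrative scoping flag.
# A real boundary list would also capture security protection assets and
# contractor risk-managed assets per your SSP.
inventory = [
    {"name": "fileserver-01", "handles_cui": True},
    {"name": "marketing-site", "handles_cui": False},
    {"name": "gcc-high-tenant", "handles_cui": True},
]

# Evidence effort goes to this list and nowhere else.
in_scope = [asset["name"] for asset in inventory if asset["handles_cui"]]
```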
4. Conflicting evidence
A policy says passwords rotate every 90 days. The system shows 180-day rotation. A user interview says "I've never rotated my password." That's three conflicting signals on one control, and it's worse than no evidence at all — it tells the assessor that what's written doesn't match what's happening.
Before an assessment, every control needs an internal reconciliation: what do the docs say, what does the system do, what will users report? All three must agree.
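The reconciliation itself can be tracked as a simple three-way comparison per control. A minimal sketch, assuming you've normalized each signal to a comparable string; the dataclass and control IDs are illustrative:

```python
from dataclasses import dataclass

@dataclass
class ControlSignals:
    control_id: str
    documented: str   # what the policy or SSP says
    observed: str     # what the system actually enforces
    reported: str     # what users say in interviews

def reconcile(signals: list[ControlSignals]) -> list[str]:
    """Return the control IDs whose three signals do not all agree."""
    return [
        s.control_id
        for s in signals
        if not (s.documented == s.observed == s.reported)
    ]
```

Run against the password example above, the control would be flagged: "90-day rotation" (docs) vs "180-day rotation" (system) vs "never rotated" (interview) cannot all be true, and the assessor will notice before you do unless this check runs first.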
5. Treating the SSP as marketing
The System Security Plan is a legal document, not a pitch deck. Avoid:
- Forward-looking language ("We will...", "By Q3 we plan to...")
- Aspirational descriptions that don't match the current state
- Vague phrases like "industry best practices" or "appropriate controls"
Write the SSP the way you'd testify about your environment. Because that's what it is — a sworn statement about what's in place today.
The pattern behind all five
Every one of these is the same root mistake: treating evidence as something you gather at the end, rather than something you generate continuously.
Environments that pass on the first attempt typically have:
- A control-to-evidence map from day one of the program
- Automated evidence collection for at least the top 40% of controls (logging, config, access)
- A monthly evidence review cycle that catches drift before the assessor does
If you're more than six months from your assessment window and you don't have those three things, that's where your next sprint should go — not another policy rewrite.
We run pre-authorization assessments that simulate the C3PAO engagement before it counts. Book a discovery call to see what yours would look like.