CMMI-DEV Workspace: From Ad Hoc to Appraised in 22 Process Areas

A practical guide for process improvement consultants and engineering leaders on using Rakenne's CMMI-DEV workspace template to build a complete, appraisal-ready process documentation set — structured by maturity level with tool-assisted validation at every step.

  • intermediate
  • 25 min read
  • 2026-03-23
  • Skills
Author Ricardo Cabral · Founder

Preparing for a CMMI-DEV appraisal is one of the most documentation-intensive undertakings in software engineering. A first-time ML3 appraisal typically requires 12–18 months of process improvement work, with organizations documenting 18 process areas across project management, engineering, and organizational disciplines. The bottleneck is rarely knowing what to write — it is translating real engineering practices into the structured, evidence-based documentation that a SCAMPI Lead Appraiser expects to see.

Rakenne’s CMMI-DEV Process Improvement workspace template provides 22 specialized skills covering every CMMI-DEV v1.3 process area from Maturity Level 2 through 5. Each skill enforces a structured workflow, loads CMMI-specific references, and uses deterministic tools to check the agent’s output — catching the kinds of errors that plain LLM drafting misses: incomplete generic practices, orphaned work products, inconsistent terminology between project-level and organizational-level documents, and missing traceability between process descriptions and actual artifacts.

This guide walks through all 22 process areas in maturity-level sequence, shows real dialog excerpts from a live session, and explains what makes tool-assisted process documentation materially better than generic AI drafting or template-based approaches.


Why plain LLMs fall short for CMMI

A plain LLM can draft process descriptions, procedures, and plans. Where it struggles is appraisal-grade rigor:

| Concern | Plain LLM | Rakenne with CMMI-DEV skills |
| --- | --- | --- |
| Generic practice coverage | Can miss GG2/GG3 requirements | Validates that each process area addresses both Specific Goals and Generic Goals for the target maturity level |
| Cross-PA traceability | Weak without structured state | Enforces links between project plans, org standards, measurement data, and process performance baselines |
| Work product completeness | May list artifacts without verifying content | Validation tools check that each expected work product exists and meets minimum content requirements |
| Consistency across levels | Project-level docs may contradict org-level standards | Cross-checks that tailored project processes align with organizational process definitions |
| Repeatable process | Output varies with prompt phrasing | Each skill defines a fixed workflow; same checks run every time |

The difference is structural: skills give the agent a spec (workflow + CMMI references + expected work products) and tools (deterministic checks) to verify its own output. This is what turns a draft into appraisal-ready evidence.
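
A minimal sketch of what such a deterministic check might look like, using the v1.3 generic-practice IDs. The data shape and function names are illustrative, not Rakenne's actual API:

```python
# Illustrative generic-goal coverage check (hypothetical data shape, not
# Rakenne's actual API). For an ML3 target, every process area must
# institutionalize both GG2 (GP 2.1-2.10) and GG3 (GP 3.1-3.2) practices.
REQUIRED_GENERIC_PRACTICES = {
    "ML2": ["GP2.1", "GP2.2", "GP2.3", "GP2.4", "GP2.5",
            "GP2.6", "GP2.7", "GP2.8", "GP2.9", "GP2.10"],
    "ML3": ["GP3.1", "GP3.2"],
}

def check_generic_goal_coverage(process_area: dict, target_level: str) -> list:
    """Return the generic practices the process area fails to address."""
    levels = ["ML2", "ML3"] if target_level == "ML3" else ["ML2"]
    required = [gp for lvl in levels for gp in REQUIRED_GENERIC_PRACTICES[lvl]]
    covered = set(process_area.get("generic_practices", []))
    return [gp for gp in required if gp not in covered]

pa = {"name": "Project Planning",
      "generic_practices": ["GP2.1", "GP2.2", "GP2.3", "GP2.4", "GP2.5",
                            "GP2.6", "GP2.7", "GP2.8", "GP2.9", "GP2.10",
                            "GP3.1"]}
print(check_generic_goal_coverage(pa, "ML3"))  # ['GP3.2'] — flagged as a gap
```

A plain LLM might simply not mention GP 3.2; a lookup like this cannot forget it.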


Maturity levels: a progressive journey

Unlike ISO frameworks that are pass/fail, CMMI-DEV is a maturity progression. Each level builds on the one below:

| Level | Name | Process Areas | What it means |
| --- | --- | --- | --- |
| ML1 | Initial | 0 | Ad hoc, chaotic — success depends on individual heroics |
| ML2 | Managed | 7 | Projects are planned, executed, measured, and controlled |
| ML3 | Defined | 18 | Organization-wide standard processes; projects tailor from a shared asset library |
| ML4 | Quantitatively Managed | 20 | Statistical process control using performance baselines |
| ML5 | Optimizing | 22 | Continuous improvement through causal analysis and innovation |

The workspace template installs all 22 skills but adapts to your target. When you tell the agent “our target is ML3”, it scopes the workflow to 18 process areas and calculates completion accordingly.
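
The scoping arithmetic is straightforward because process-area counts are cumulative per level. A sketch, with the per-level counts from CMMI-DEV v1.3 (7 + 11 + 2 + 2 = 22):

```python
# Sketch: scope the 22 skills to a target maturity level and compute
# completion. Per-level process-area counts follow CMMI-DEV v1.3.
PA_COUNT_BY_LEVEL = {"ML2": 7, "ML3": 11, "ML4": 2, "ML5": 2}
LEVEL_ORDER = ["ML2", "ML3", "ML4", "ML5"]

def in_scope_count(target: str) -> int:
    """Cumulative number of process areas required for a target level."""
    idx = LEVEL_ORDER.index(target)
    return sum(PA_COUNT_BY_LEVEL[lvl] for lvl in LEVEL_ORDER[: idx + 1])

def completion_pct(areas_done: int, target: str) -> int:
    """Overall completion percentage against the target's scope."""
    return round(100 * areas_done / in_scope_count(target))

print(in_scope_count("ML3"))      # 18
print(completion_pct(7, "ML3"))   # 39 — e.g. all of ML2 done, ML3 not started
```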


The 22-process-area journey

| ML | # | Process Area | Abbreviation | What gets validated |
| --- | --- | --- | --- | --- |
| ML2 | 1 | Project Planning | PP | Estimate basis, schedule dependencies, resource allocation, risk identification |
| ML2 | 2 | Requirements Management | REQM | Baseline integrity, bidirectional traceability, change control procedures |
| ML2 | 3 | Configuration Management | CM | CI identification, baseline records, change request workflow |
| ML2 | 4 | Project Monitoring & Control | PMC | Tracking frequency, status report completeness, corrective action criteria |
| ML2 | 5 | Process & Product QA | PPQA | Audit coverage, noncompliance escalation, objectivity |
| ML2 | 6 | Measurement & Analysis | MA | Objective–metric alignment, data collection procedures, analysis methods |
| ML2 | 7 | Supplier Agreement Management | SAM | SOW completeness, acceptance criteria, oversight procedures |
| ML3 | 8 | Organizational Process Definition | OPD | Standard process completeness, tailoring guidelines, asset library structure |
| ML3 | 9 | Organizational Process Focus | OPF | Improvement plan coverage, appraisal action items, deployment tracking |
| ML3 | 10 | Organizational Training | OT | Needs analysis, curriculum coverage, competency records |
| ML3 | 11 | Integrated Project Management | IPM | Tailoring rationale, stakeholder involvement, org standard compliance |
| ML3 | 12 | Requirements Development | RD | Stakeholder elicitation, requirement allocation, validation criteria |
| ML3 | 13 | Technical Solution | TS | Alternative evaluation, design decision rationale, interface specifications |
| ML3 | 14 | Product Integration | PI | Integration sequence, build verification, interface management |
| ML3 | 15 | Verification | VER | Peer review coverage, verification method selection, defect tracking |
| ML3 | 16 | Validation | VAL | Validation environment, acceptance criteria, operational scenario coverage |
| ML3 | 17 | Risk Management | RSKM | Risk taxonomy, mitigation strategy, trigger conditions |
| ML3 | 18 | Decision Analysis & Resolution | DAR | Criteria weighting, alternative scoring, decision rationale |
| ML4 | 19 | Organizational Process Performance | OPP | Baseline statistical validity, model accuracy, data sufficiency |
| ML4 | 20 | Quantitative Project Management | QPM | SPC chart interpretation, process stability, objective achievement |
| ML5 | 21 | Causal Analysis & Resolution | CAR | Root cause identification method, action effectiveness, defect prevention |
| ML5 | 22 | Organizational Performance Management | OPM | Business objective alignment, innovation pipeline, deployment tracking |

Maturity Level 2 — Managed: establishing project discipline

Step 1: Project Planning — the foundation everything else references

Every CMMI engagement starts with Project Planning. The PP skill produces a project plan that PMC, IPM, and RSKM all reference — making it the foundational artifact of the entire process area hierarchy.

Project Planning (PP)

You: We’re a 120-person defense software contractor in Arlington, VA. Our target is ML3. Let’s start with Project Planning for our flagship radar signal processing project, codenamed APEX.

Agent: I’ve set the target maturity level to ML3 (18 process areas). Let me draft the project plan for Project APEX. First, I need to establish the planning parameters.

I’ve drafted the project plan covering scope (radar signal processing software, embedded C++/FPGA), WBS with 6 work packages, effort estimates based on historical data (COCOMO II adjusted for your domain), schedule with milestones and dependencies, resource allocation for 14 team members, and 8 identified risks.

Running the PP validation suite…

Estimate basis checker: PASS. All effort estimates reference historical data or parametric models. The COCOMO II calibration factor accounts for safety-critical software complexity.

Schedule dependency validator: PASS. All 23 task dependencies are acyclic. Critical path identified: requirements → architecture → unit implementation → integration test → system validation. Total duration: 14 months.

Risk identification completeness: WARNING. 8 risks identified, but no risk addresses ITAR export control requirements for the FPGA firmware — a common audit finding for defense contractors. I’ve added RSK-009: “ITAR compliance gap in firmware distribution chain” with mitigation referencing your export control officer.

I’ve also updated the dashboard: Project Planning is complete, ML2 phase started, and currentMaturityLevel set to ML2.
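
The acyclic-dependency check is exactly the kind of validation a plain LLM cannot guarantee but a deterministic tool can. A minimal version using Kahn's algorithm, with task names taken from the APEX critical path (the dict-based input format is illustrative, not Rakenne's internal schema):

```python
from collections import deque

def find_cycle_free_order(tasks, deps):
    """Kahn's algorithm: return a valid execution order for the tasks,
    or None if the dependency graph contains a cycle.
    deps are (prerequisite, dependent) pairs."""
    indegree = {t: 0 for t in tasks}
    successors = {t: [] for t in tasks}
    for pre, dep in deps:
        successors[pre].append(dep)
        indegree[dep] += 1
    ready = deque(t for t in tasks if indegree[t] == 0)
    order = []
    while ready:
        t = ready.popleft()
        order.append(t)
        for s in successors[t]:
            indegree[s] -= 1
            if indegree[s] == 0:
                ready.append(s)
    return order if len(order) == len(tasks) else None  # None => cycle

tasks = ["requirements", "architecture", "implementation",
         "integration_test", "system_validation"]
deps = [("requirements", "architecture"),
        ("architecture", "implementation"),
        ("implementation", "integration_test"),
        ("integration_test", "system_validation")]
print(find_cycle_free_order(tasks, deps))  # a valid order exists => PASS
```

If any schedule edit introduces a circular dependency, the check returns None on every run, regardless of how the prompt was phrased.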


Step 2: Requirements Management — maintaining the baseline

The Requirements Management skill establishes a requirements baseline with bidirectional traceability. It ensures that every requirement is tracked from source (customer need) through implementation to verification — the chain that appraisers follow.

Requirements Management (REQM)

You: Establish requirements management for Project APEX. We have 47 system requirements from the prime contractor’s SRS and 12 derived requirements from our architecture team.

Agent: I’ve built the requirements baseline with 59 entries (47 allocated + 12 derived) and the bidirectional traceability matrix. Let me validate.

Baseline integrity checker: PASS. All 59 requirements have unique IDs, status (Approved/Draft), source reference, and priority. The 12 derived requirements each trace back to at least one parent allocated requirement.

Traceability completeness: WARNING. 3 allocated requirements (REQ-031, REQ-038, REQ-044) have no downstream trace to design elements yet. These are “forward traceability gaps” — expected at this stage but flagged for the Technical Solution step. I’ve marked them in the matrix.

Change control procedure: PASS. The change request workflow includes impact analysis, CCB review criteria, baseline update procedures, and notification to affected stakeholders.
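
The forward-traceability gap check reduces to a set difference over the trace matrix. A sketch, with requirement and design-element IDs as illustrative placeholders:

```python
# Sketch of the forward-traceability gap check: find requirements with no
# downstream trace to a design element. IDs are illustrative placeholders.
def forward_trace_gaps(requirements, trace_links):
    """Return requirement IDs that appear in no (req_id, design_id) link."""
    traced = {req for req, _design in trace_links}
    return sorted(r for r in requirements if r not in traced)

requirements = {f"REQ-{i:03d}": {"status": "Approved"} for i in (31, 38, 44, 45)}
trace_links = [("REQ-045", "DES-012")]
print(forward_trace_gaps(requirements, trace_links))
# ['REQ-031', 'REQ-038', 'REQ-044'] — the same gaps the agent flagged
```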


Step 3: Configuration Management — controlling what you build

Configuration Management (CM)

You: Set up configuration management for APEX. We use Git for source code, Jira for work items, and Confluence for documentation.

Agent: I’ve drafted the CM Plan identifying 34 configuration items across 4 categories: source code (18 modules), documentation (8 documents), test artifacts (5 test suites), and build artifacts (3 deliverables). Let me validate.

CI identification completeness: PASS. All 34 items have unique identifiers, owners, and classification levels. Naming convention follows MIL-STD-973 format adapted for your project.

Baseline record validator: PASS. Three baselines defined — Functional Baseline (requirements), Allocated Baseline (design), and Product Baseline (code + tests). Each baseline has entry/exit criteria, approval authority, and audit trail requirements.

Change request workflow: PASS. The CR workflow includes initiation, impact analysis, CCB decision, implementation, verification, and baseline update — with Git branch protection rules mapped to each stage.
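
A naming-convention check is a one-line regex once the scheme is fixed. The pattern below is purely hypothetical — a project-category-number scheme standing in for whatever MIL-STD-973-adapted format the CM plan actually defines:

```python
import re

# Hypothetical CI naming scheme: <project>-<category>-<3-digit number>.
# The real pattern would come from the project's CM plan.
CI_PATTERN = re.compile(r"^APEX-(SRC|DOC|TST|BLD)-\d{3}$")

def invalid_ci_ids(ci_ids):
    """Return configuration-item IDs that violate the naming convention."""
    return [ci for ci in ci_ids if not CI_PATTERN.match(ci)]

print(invalid_ci_ids(["APEX-SRC-001", "APEX-DOC-002", "apex-src-3"]))
# ['apex-src-3'] — flagged for the CM owner to correct
```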


Steps 4–7: completing ML2

The remaining ML2 process areas follow the same pattern:

  • Project Monitoring & Control (PMC) — defines earned value tracking procedures, status report templates, and corrective action triggers. The validation suite checks that monitoring frequency aligns with project milestones and that corrective action plans reference specific PP baselines.

  • Process & Product Quality Assurance (PPQA) — produces the QA plan with audit schedules, checklists, and noncompliance escalation procedures. The objectivity checker verifies that QA auditors are independent from the work they review.

  • Measurement & Analysis (MA) — establishes measurement objectives aligned to project and organizational goals, defines data collection procedures, and creates the measurement repository schema. The alignment checker verifies that every metric traces to a stated information need.

  • Supplier Agreement Management (SAM) — documents supplier agreements, SOWs, and acceptance criteria. Only applicable if the organization uses external suppliers; the agent skips this if it learns no suppliers are in scope.
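
The MA alignment check — every metric must trace to a stated information need — can be sketched as a simple lookup, GQM-style. Metric names and questions here are illustrative:

```python
# Sketch of the MA alignment check: each metric maps to the information
# need (question) it answers; metrics with no need are flagged.
def unaligned_metrics(metrics):
    """Return metric names whose information need is missing or empty."""
    return sorted(m for m, need in metrics.items() if not need)

metrics = {
    "defect_density": "Are we meeting quality targets?",
    "schedule_variance": "Is the project tracking to plan?",
    "lines_of_code": None,  # collected, but answers no stated question
}
print(unaligned_metrics(metrics))  # ['lines_of_code']
```

Metrics collected without a purpose are a classic appraisal finding; this check makes them impossible to overlook.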

After completing all 7 ML2 process areas, the dashboard shows:

ML2 Complete

Agent: All 7 Maturity Level 2 process areas are now complete. Dashboard updated:

  • ML2 Completion: 100%
  • Overall Completion: 39% (7/18 for ML3 target)
  • Documents Produced: 12
  • Current Maturity Level: ML2

Ready to proceed to ML3. I recommend starting with Organizational Process Definition (OPD) — it establishes the standard processes that all other ML3 skills reference.


Maturity Level 3 — Defined: standardizing across the organization

ML3 is where the biggest shift happens. Instead of each project defining its own processes, the organization establishes standard processes that projects then tailor. This is also where most organizations target their first CMMI appraisal.

Step 8: Organizational Process Definition — the standard process library

OPD is the foundation of ML3. It produces the organization’s standard process set, tailoring guidelines, and process asset library — the artifacts that every project will reference through IPM.

Organizational Process Definition (OPD)

You: Define our organizational standard processes. We have 3 active projects (APEX, BEACON, CITADEL) that should all follow the same process framework.

Agent: I’ve drafted the organizational standard process set with 8 defined processes covering the full development lifecycle: requirements management, design, implementation, integration, verification, validation, configuration management, and project management. Each process includes purpose, entry/exit criteria, activities, roles, work products, and measurement points. Let me validate.

Standard process completeness: PASS. All 8 processes have the required elements per CMMI OPD SP 1.1. Each process references applicable CMMI practice areas and maps activities to specific goals.

Tailoring guidelines validator: PASS. The tailoring guidelines define 3 project categories (safety-critical, mission-critical, general) with mandatory and optional process elements for each. Tailoring rationale must be documented and approved by the process group.

Process asset library structure: PASS. The PAL includes process descriptions, templates, checklists, lessons learned, and measurement data — organized by process area with versioning and ownership.


Step 9: Organizational Process Focus — driving improvement

Organizational Process Focus (OPF)

You: Create the process improvement plan. Our last informal assessment identified weaknesses in peer review practices and inconsistent risk management across projects.

Agent: I’ve drafted the Process Improvement Plan (PIP) with 6 improvement initiatives prioritized by impact and feasibility. The top priorities are:

  1. Peer review standardization — Define common review criteria, checklists, and defect classification across all projects
  2. Risk management harmonization — Align risk taxonomies and mitigation strategies with the organizational standard

Running validation…

Improvement plan coverage: PASS. All 6 initiatives have defined objectives, success criteria, resource requirements, schedule, and responsible owners. Each initiative traces to an identified process weakness — either from the informal assessment, project retrospectives, or QA findings.

Appraisal action items: The PIP includes preparation activities for the target SCAMPI A appraisal in Q4 2027 — mock appraisal (Q2 2027), evidence collection, and interview preparation.


Steps 10–18: building the engineering and support processes

The remaining ML3 process areas build on OPD’s standard processes:

  • Organizational Training (OT) — needs analysis identifies 4 training audiences (project managers, developers, QA engineers, process group) with role-specific curricula. Competency records track completion.

  • Integrated Project Management (IPM) — takes the ML2 project plan and aligns it with organizational standards. The tailoring rationale checker verifies that every deviation from the standard process is documented and approved.

  • Requirements Development (RD) — extends REQM by eliciting stakeholder needs, deriving customer requirements, and allocating to product components. The allocation completeness checker verifies every requirement traces to at least one component.

  • Technical Solution (TS) — selects and designs solutions. The decision rationale checker verifies that alternative evaluations use defined criteria and that design decisions are traceable to requirements.

  • Product Integration (PI) — plans the integration sequence, defines interface management procedures, and establishes the build environment. The integration sequence validator checks that dependencies are respected.

  • Verification (VER) — plans verification methods (peer reviews, testing, analysis) for each work product. The peer review coverage checker verifies that all critical work products have scheduled reviews with defined entry/exit criteria.

  • Validation (VAL) — ensures the product works in its intended operational environment. The validation environment checker verifies that test conditions represent real operational scenarios.

  • Risk Management (RSKM) — extends ML2’s basic risk identification into a systematic risk management strategy. The risk taxonomy checker verifies comprehensive coverage across technical, programmatic, and business risk categories.

  • Decision Analysis & Resolution (DAR) — formalizes significant decisions. The criteria weighting checker verifies that evaluation criteria are explicitly weighted and that scores are consistently applied across alternatives.
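
A DAR evaluation matrix reduces to a weighted sum once criteria and weights are explicit. A sketch, with alternatives, weights, and scores as illustrative placeholders:

```python
# Sketch of a DAR-style weighted evaluation: explicit criteria weights,
# scores applied consistently across alternatives. Values are illustrative.
def score_alternatives(weights, scores):
    """Return a weighted total (0-10 scale) per alternative."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return {alt: round(sum(weights[c] * s[c] for c in weights), 2)
            for alt, s in scores.items()}

weights = {"performance": 0.5, "cost": 0.3, "maintainability": 0.2}
scores = {
    "FPGA offload": {"performance": 9, "cost": 4, "maintainability": 5},
    "GPU offload":  {"performance": 7, "cost": 6, "maintainability": 8},
}
print(score_alternatives(weights, scores))
# {'FPGA offload': 6.7, 'GPU offload': 6.9}
```

The criteria weighting checker's job is essentially to confirm this structure exists: weights that sum to 1, every alternative scored against every criterion, and a recorded rationale for the winner.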

ML3 Complete

Agent: All 18 process areas through Maturity Level 3 are complete. Dashboard updated:

  • ML2 Completion: 100%
  • ML3 Completion: 100%
  • Overall Completion: 100% (18/18 for ML3 target)
  • Documents Produced: 38
  • Current Maturity Level: ML3
  • Appraisal Readiness: 87%

The 87% readiness score reflects that all process areas have documented artifacts, but 3 areas need additional evidence from actual project execution (peer review records, measurement data, and training completion records). These will strengthen your evidence base for the SCAMPI appraisal.


Maturity Level 4 — Quantitatively Managed: statistical process control

ML4 adds statistical rigor. Most organizations reach ML4 only after operating at ML3 for 1–2 years, accumulating enough measurement data to establish meaningful baselines.

Step 19: Organizational Process Performance — establishing baselines

Organizational Process Performance (OPP)

You: We’ve been operating at ML3 for 18 months and have measurement data from 6 completed projects. Establish our process performance baselines.

Agent: I’ve analyzed the measurement repository and established 4 process performance baselines: defect density (defects per KSLOC by severity), schedule estimation accuracy (actual vs. planned duration ratio), peer review effectiveness (defects found per review hour), and requirements volatility (change rate per month). Let me validate.

Statistical validity checker: PASS. All 4 baselines are derived from at least 6 data points (projects), with documented central tendency, variation (standard deviation), and confidence intervals. The defect density baseline shows a mean of 2.3 defects/KSLOC with σ=0.8 — consistent enough for statistical process control.

Model accuracy: PASS. The process performance model predicts defect injection rates based on complexity and developer experience. Back-testing against 3 held-out projects shows prediction error within ±15% — acceptable for project planning use.
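
The statistical-validity check boils down to enforcing a minimum sample size and computing central tendency, spread, and control limits. A sketch using Python's `statistics` module — the defect-density samples are illustrative values chosen to match the baseline in the dialog:

```python
import statistics

# Sketch of a process-performance baseline: require a minimum number of
# data points, then derive mean, sample standard deviation, and 3-sigma
# control limits. Sample values (defects/KSLOC, one per completed project)
# are illustrative.
def process_baseline(samples, min_points=6):
    if len(samples) < min_points:
        raise ValueError(f"need >= {min_points} data points, got {len(samples)}")
    mean = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    return {"mean": round(mean, 2),
            "sigma": round(sigma, 2),
            "ucl": round(mean + 3 * sigma, 2),          # upper control limit
            "lcl": round(max(0.0, mean - 3 * sigma), 2)}  # floored at zero

samples = [1.2, 1.7, 2.1, 2.5, 2.9, 3.4]
print(process_baseline(samples))
# mean 2.3, sigma 0.8 — matching the baseline reported above
```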


Step 20: Quantitative Project Management — using baselines for decisions

QPM applies the OPP baselines to active project management, using statistical techniques to detect process instability and predict outcomes.
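
The simplest stability test QPM applies is flagging any observation outside the 3-sigma control limits (Western Electric rule 1). A sketch, with weekly readings as illustrative values:

```python
# Sketch of a QPM stability check against an OPP baseline: flag data
# points outside mean ± 3*sigma (Western Electric rule 1). Values are
# illustrative.
def out_of_control(points, mean, sigma):
    """Return indices of points outside the 3-sigma control limits."""
    ucl, lcl = mean + 3 * sigma, mean - 3 * sigma
    return [i for i, p in enumerate(points) if p > ucl or p < lcl]

weekly_defect_density = [2.1, 2.6, 1.9, 5.2, 2.4]  # defects/KSLOC
print(out_of_control(weekly_defect_density, mean=2.3, sigma=0.8))
# [3] — the 5.2 reading exceeds the UCL of 4.7 and triggers causal analysis
```

An out-of-control signal here is what feeds the ML5 Causal Analysis & Resolution loop described in the next section of the journey.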


Maturity Level 5 — Optimizing: continuous improvement

ML5 closes the loop with systematic root cause analysis and strategic performance management.

Step 21: Causal Analysis & Resolution — fixing the system, not just the symptom

Causal Analysis & Resolution (CAR)

You: We’ve had 3 recurring integration test failures in the last quarter. Analyze the root causes.

Agent: I’ve conducted a causal analysis using the Ishikawa (fishbone) method on the 3 integration test failures. Root causes identified:

  1. Interface specification ambiguity — 2 of 3 failures traced to incomplete interface control documents (ICDs) that didn’t specify error handling behavior
  2. Build environment drift — 1 failure caused by library version mismatch between development and integration test environments

Running validation…

Root cause identification method: PASS. The analysis used a structured method (Ishikawa), involved relevant stakeholders (lead developers, integration testers, architect), and traced each failure to a specific process weakness rather than blaming individuals.

Proposed actions:

  • Action 1: Add mandatory error handling sections to ICD template (addresses root cause 1, prevents similar failures in RD and TS process areas)
  • Action 2: Implement containerized build environments with pinned dependency manifests (addresses root cause 2, prevents environment drift)

Action effectiveness criteria: PASS. Both actions have defined success metrics — zero ICD-related integration failures in the next 2 sprints, and zero environment-related failures after container adoption.


Step 22: Organizational Performance Management — strategic improvement

OPM completes the CMMI journey by aligning process improvement with business objectives and deploying innovations across the organization.


Cross-process-area traceability: the appraiser’s perspective

SCAMPI Lead Appraisers don’t evaluate process areas in isolation — they trace chains. “Show me the organizational standard process. Show me how this project tailored it. Show me the measurement data that proves it works.” If one link breaks, the appraiser rates the goal as unsatisfied.

Rakenne’s cross-PA validation tools enforce these links automatically:

  • Project Plan → Organizational Standards: IPM’s tailoring rationale checker verifies that every project process references the OPD standard and documents deviations
  • Requirements → Design → Verification: The traceability chain from RD through TS to VER is checked bidirectionally — every requirement has a design element, every design element has a verification method
  • Measurement Data → Baselines: MA data feeds OPP baselines; the statistical validity checker ensures baselines are derived from sufficient, representative data
  • QA Findings → Process Improvement: PPQA audit findings feed OPF improvement plans; the reconciliation checker ensures no finding is orphaned
  • Risk Register → Project Plan: RSKM risks cross-reference PP risk sections; the alignment checker ensures consistency

When you run the appraisal readiness assessment after completing all process areas, these checks aggregate into a single readiness score — broken down by CMMI category (Project Management, Engineering, Process Management, Support, High Maturity).
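
The aggregation itself is a weighted average over category scores. A sketch — the category weights below are illustrative, not Rakenne's actual scoring model:

```python
# Sketch of a weighted readiness composite across CMMI categories.
# Weights are illustrative — not Rakenne's actual scoring model. For an
# ML3 target, High Maturity carries zero weight rather than penalizing.
def readiness_score(category_scores, weights):
    total_w = sum(weights.values())
    return round(sum(category_scores[c] * weights[c] for c in weights) / total_w)

scores = {"Project Management": 95, "Engineering": 85,
          "Process Management": 90, "Support": 88, "High Maturity": 0}
weights = {"Project Management": 0.25, "Engineering": 0.30,
           "Process Management": 0.25, "Support": 0.20, "High Maturity": 0.0}
print(readiness_score(scores, weights))
```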


The dashboard: tracking progress across maturity levels

As each skill completes, the agent updates the project dashboard. The dashboard provides a single view of process improvement progress:

Key metrics tracked:

  • Program Completion — percentage of process areas completed for the target maturity level
  • Target vs. Current Maturity Level — where you’re aiming vs. where you are
  • ML2/ML3/ML4/ML5 Completion — per-level progress bars
  • Process Area Capability — radar chart showing current vs. target capability ratings across all PAs
  • Documents Produced — total output artifacts
  • Risk Distribution — breakdown by severity
  • Improvement Actions — open vs. closed from OPF
  • Appraisal Readiness — weighted composite score across 5 categories

The dashboard adapts to the target maturity level. If you’re targeting ML3, the ML4 and ML5 progress bars show as “Not in scope” rather than penalizing you for incomplete high-maturity process areas.


Effort comparison: consultant time with and without tool assistance

Based on typical CMMI consulting effort breakdowns for first-time ML3 appraisal preparation:

| Activity | % of effort | Tool-assisted acceleration |
| --- | --- | --- |
| Process definition (OPD, tailoring guidelines) | 20% | Standard process templates with completeness validation reduce iteration cycles |
| Project-level process documentation (PP, REQM, CM, etc.) | 25% | Structured workflows ensure each PA addresses all Specific and Generic Goals |
| Risk assessment and measurement framework | 15% | Cross-PA traceability checks catch inconsistencies between risk registers and project plans |
| Training and competency documentation | 8% | Needs analysis templates with curriculum coverage validation |
| QA and process improvement | 12% | Audit objectivity checking, finding-to-improvement traceability |
| Appraisal preparation (evidence collection, interviews) | 15% | Appraisal readiness scoring identifies evidence gaps before the SCAMPI team arrives |

The heaviest activities (process definition, project documentation, evidence collection) are exactly where the validation tools add the most value — not by replacing consultant judgment, but by catching the structural gaps and traceability breaks that consume review cycles and lead to “Not Satisfied” ratings.


Getting started

  1. Create a new project in Rakenne and select the CMMI-DEV Process Improvement workspace template
  2. All 22 skills are automatically installed
  3. Tell the agent your target maturity level — ML2, ML3, ML4, or ML5
  4. Start with Project Planning (PP) to establish the foundation, then follow the maturity-level sequence
  5. Use the dashboard to track progress per maturity level and monitor appraisal readiness

Each skill is independent but reads artifacts from earlier steps. You can run them in any order, but the recommended sequence ensures each skill has the context it needs from prior outputs — and that the artifact chain will survive SCAMPI scrutiny.


Summary

The CMMI-DEV Process Improvement workspace template turns appraisal preparation from a scattered documentation effort into a structured, validated process. The 22 skills cover every CMMI-DEV v1.3 process area from ML2 through ML5, and the validation tools enforce the same checks a SCAMPI Lead Appraiser would apply — consistently, automatically, and traceably.

What sets this apart from template-based approaches or generic AI drafting is the maturity-level-aware progression and cross-process-area traceability. The workspace adapts to your target — whether you’re a small team targeting ML2 or a defense contractor pursuing ML5. And at every step, deterministic tools verify that project plans align with organizational standards, requirements trace through design to verification, measurement data supports process baselines, and improvement actions close the loop from findings to prevention.

The result is not a set of generic process templates. It is a set of internally consistent, organization-specific, CMMI-aligned artifacts that reference each other, trace goals to practices to work products to evidence, and flag gaps before an appraiser does.

Try it yourself

Open a workspace with the skills described in this article and start drafting in minutes.

Get Started Free — No Sign-Up
