Lab · Planning & governance

Intentional evolution, not accidental drift

CDE (Capability-Driven Evolution) is a conceptual model and type system for how and why a system should change. Change is motivated by formalized drivers, scored on explicit value dimensions, expressed through capability transformations, and executed inside resource-bounded iterations, with a full reconciliation loop when execution diverges from the plan.

  • Capability-centric: express evolution as what the system can do, orthogonal to how it is built from components.
  • Driver-motivated: every slice of work traces to a need; technical debt and policy pressure are first-class drivers.
  • Iteration-bounded: discrete cycles with budgets and timeboxes, not infinite-scope “roadmaps without physics.”
  • Transparent trade-offs: weights, anchored scores, and derived effort make disagreements about facts, not folklore.

6-step

Dimension derivation

From beneficiaries and conflicts to weighted value dimensions with anchored 1–5 scales: comparability without pretending scores are universal.

10-step

Post-iteration reconciliation

Structured outcome capture: achieved vs target, divergence classes, unplanned capability work, emergent deliveries, actuals, narratives, and model updates.

Derived

Driver cost

Costs sum unique capability state transitions; shared evolution across drivers is counted once, enabling honest iteration budgets.

Why capability-driven

Components change constantly; capabilities describe what must stay true

If planning only tracks tickets and repositories, you optimize for activity. If planning tracks capability levels and the drivers that justify motion, you optimize for stakeholder value under hard constraints and preserve an audit trail when priorities shift.

Vocabulary

Core concepts

Precise words prevent the situation where “the roadmap said X but the code did Y” and nobody can reconstruct why.

System

The bounded whole that delivers value.

Defined externally by capabilities (verbs) and internally by components (nouns). Those views are orthogonal (many-to-many, not a tidy 1:1 map).

Capability

What the system can achieve: outcome-oriented and measurable.

Implementation-independent, composable, traceable to beneficiaries and forward to work. The aggregate of capabilities is the external character of the system.

Component

What the system is made of: ownable, deployable pieces.

Structural decomposition; capabilities stay stable across technology churn when scoped well.

Scope

Boundaries drawn around a capability across dimensions.

Functional, operational, performance, interface, and temporal scope together prevent “everything is in scope” ambiguity.

Value dimension

An axis on which drivers are scored, with explicit weights.

Each dimension carries a scale definition so “4 on Stakeholder Value” means something concrete and reviewable (recommended periodic review, e.g. 90 days).
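As a sketch, an anchored dimension might be modeled like this; every name, weight, and anchor text below is a made-up illustration, not a prescribed CDE schema:

```python
from dataclasses import dataclass

@dataclass
class ValueDimension:
    name: str
    weight: float                # explicit, revisable policy choice
    anchors: dict                # 1-5 ordinal scale with concrete meanings
    rationale: str               # why this dimension and weight exist
    review_after_days: int = 90  # recommended periodic review window

    def describe(self, score: int) -> str:
        """Translate a raw score into its anchored, reviewable meaning."""
        return f"{score} on {self.name}: {self.anchors[score]}"

stakeholder_value = ValueDimension(
    name="Stakeholder Value",
    weight=0.35,
    anchors={
        1: "No identifiable beneficiary impact",
        2: "Minor convenience for a single group",
        3: "Clear benefit for one beneficiary group",
        4: "Material benefit across several groups",
        5: "Transformative for primary beneficiaries",
    },
    rationale="Primary lens agreed in the weighting workshop",
)
```

The anchor text is what makes a score reviewable: disagreement becomes “is this really a 4 by the written definition,” not a debate about private scales.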

Feedback → Driver

From raw signal to formalized force.

Feedback is heterogeneous; drivers are source-agnostic, solution-neutral, prioritizable, and traceable, with a lifecycle from emerging to addressed or obsolete.

Closed loop

From signal to shipped capability, and back

Step 1

Feedback

Raw signal from operations, users, market, regulation, telemetry: unstructured but real.

Step 2

Drivers

Formalized needs that planning can reason about without prescribing implementation.

Step 3

Selection

Prioritization, dependency order, and budget feasibility pick what enters the next iteration.

Step 4

Capability targets

Gap analysis sets required capability states; evolution is ordinal along defined state spaces, not binary feature flags.

Step 5

Component work

Engineering decomposes capability motion into buildable increments inside the timebox.

Step 6

Evolved system

Observation produces new feedback, and the loop closes deliberately rather than via backlog noise.
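Step 4’s ordinal state spaces can be sketched concretely. The capability and its states borrow the request-tracking example from the Millbrook walkthrough; the helper and state names are illustrative assumptions:

```python
# Capability evolution as ordinal motion along a defined state space,
# not a binary feature flag. States and helper names are assumptions.
REQUEST_TRACKING = [
    "manual tracking",          # ordinal 0
    "digital intake",           # ordinal 1
    "status visibility",        # ordinal 2
    "proactive notifications",  # ordinal 3
]

def gap(current: str, target: str, states: list) -> list:
    """Return the ordered state transitions needed to move current -> target."""
    i, j = states.index(current), states.index(target)
    if j <= i:
        return []  # target already met; no motion required
    return [(states[k], states[k + 1]) for k in range(i, j)]

# Gap analysis: each (from, to) pair is one transition to estimate and budget.
transitions = gap("manual tracking", "status visibility", REQUEST_TRACKING)
```

Because motion is ordinal, a target implies every intermediate transition, which is exactly what the effort model prices.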

Priority, effort, and efficiency

Each driver is scored per value dimension using that dimension’s anchored scale; a weighted sum yields priority. Weights are an explicit policy choice, revisable when strategy shifts, rather than a hidden spreadsheet.

Effort lives on capability state transitions: each non-zero state carries an estimate to reach it from the prior ordinal. A driver’s cost is the sum of unique transitions implied by its requirements; duplicates collapse, so shared foundations are priced once. Efficiency becomes value per unit of that derived cost, respecting dependency graphs that pure scoring ignores.
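A minimal sketch of that arithmetic, assuming hypothetical dimension names, weights, scores, and transition estimates (none of these numbers come from a normative CDE schema):

```python
weights = {"stakeholder_value": 0.4, "risk_reduction": 0.35, "efficiency": 0.25}

# driver -> (anchored 1-5 scores per dimension, required capability transitions)
drivers = {
    "cross_dept_visibility": (
        {"stakeholder_value": 4, "risk_reduction": 2, "efficiency": 3},
        {("requests", 0, 1), ("requests", 1, 2)},
    ),
    "audit_readiness": (
        {"stakeholder_value": 2, "risk_reduction": 5, "efficiency": 3},
        {("requests", 0, 1), ("records", 0, 1)},  # shares requests 0 -> 1
    ),
}

# estimate to reach each non-zero state from the prior ordinal
transition_effort = {
    ("requests", 0, 1): 8, ("requests", 1, 2): 5, ("records", 0, 1): 3,
}

def priority(scores):
    """Weighted sum of anchored scores on the value dimensions."""
    return sum(weights[d] * s for d, s in scores.items())

# Unique-transition accounting: shared foundations are priced once.
unique = set().union(*(t for _, t in drivers.values()))
iteration_cost = sum(transition_effort[t] for t in unique)  # 16, not 8+5+8+3

# Efficiency: value per unit of derived cost, per driver.
efficiency = {
    name: priority(scores) / sum(transition_effort[t] for t in ts)
    for name, (scores, ts) in drivers.items()
}
```

The duplicate `("requests", 0, 1)` transition collapses in the union, which is the whole point: two drivers citing the same foundation do not double its price.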

Iteration lifecycle

Only planned, in-progress, and completed iterations shape the committed trajectory; draft and proposed iterations are negotiation artifacts.

Pre-commitment

Draft

Internal prep: drivers and targets forming.

Pre-commitment

Proposed

Client review: scope and budget under negotiation.

Post-commitment

Planned

Approved path; waiting to start.

Post-commitment

In progress

Resources flowing; transitions underway.

Post-commitment

Completed

Outcomes recorded; system state advanced.

Terminal

Cancelled

Terminated; leaves the active evolution path.
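The lifecycle above can be sketched as a small state machine; the allowed transitions are an assumption consistent with the pre-/post-commitment split, not a prescribed specification:

```python
# Allowed next states per lifecycle state (assumed, illustrative).
LIFECYCLE = {
    "draft":       {"proposed", "cancelled"},
    "proposed":    {"planned", "draft", "cancelled"},  # negotiation may loop back
    "planned":     {"in_progress", "cancelled"},
    "in_progress": {"completed", "cancelled"},
    "completed":   set(),  # outcomes recorded; system state advanced
    "cancelled":   set(),  # terminal; leaves the active evolution path
}

# Only these states shape the committed trajectory.
COMMITTED = {"planned", "in_progress", "completed"}

def advance(state: str, nxt: str) -> str:
    """Move an iteration to its next state, rejecting illegal jumps."""
    if nxt not in LIFECYCLE[state]:
        raise ValueError(f"illegal transition {state} -> {nxt}")
    return nxt
```

Encoding the transitions makes the draft/proposed distinction enforceable: nothing reaches the committed set without passing through negotiation.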

Educational scenario

Walkthrough: Millbrook City (composite scenario)

The conceptual walkthrough follows a fragmented municipal services stack: permits in spreadsheets, licenses in a legacy desktop app, utilities split across vendor portals, enforcement in email threads. Leadership lacks a unified picture of citizen journeys.

The narrative derives five value dimensions (citizen experience, operational efficiency, governance visibility, risk reduction, strategic enablement), negotiates weights, captures verbatim feedback, formalizes drivers like cross-department request visibility, defines ordered capability state spaces (e.g. request tracking from manual tracking to proactive notifications), and plans a Q1 “Foundation” iteration whose budget is the deduplicated sum of required transitions, not three times the same database migration because three drivers mention it.

Talking about CDE by audience

Executive pitch (two minutes)

Systems evolve whether or not you design for it. CDE makes evolution intentional: define what “better” means, capture the forces that demand change, and run resource-bounded iterations that maximize value delivery. Every euro traces from stakeholder need through capability motion to budget, so trade-offs stay visible instead of dissolving into backlog politics.

CDE and Agile

Agile describes how to execute inside a timebox; CDE describes what belongs in the timebox and why. Sprints still need a principled backlog: drivers, scores, capability targets, and validated dependency paths supply that backbone without replacing team rituals.

Audience | Lead with | Light touch on
Executives | Accountability, allocation, risk, traceable decisions; evolution changelog as receipts. | Deep schema jargon; lead with outcomes and governance.
Product | Transparent prioritization: disagree on weights or facts, not mystery math. | Over-indexing on types; emphasize facilitation and conflict surfacing.
Engineering | State spaces, dependency proofs, feasibility before commitment. | Hand-waving rigor; formalism is a feature for this audience.
Operations / support | Tickets and incidents become feedback with urgency, not a black hole. | Abstract vocabulary without operational examples.

Iteration planning mechanics

Rank drivers, estimate transitions, deduplicate shared work, buffer for confidence, validate dependencies, document rationale in the changelog.

Rank, then respect physics

Drivers ordered by priority still must fit budget and topological constraints. Iterate selection until the path is valid or prerequisites are pulled forward.

Unique transition accounting

Multiple drivers needing the same capability jump pay for that transition once, which yields realistic costing for shared foundations.

Confidence-aware buffers

Low-confidence estimates carry explicit buffers; investigation spikes can precede expensive transitions.
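The selection mechanics above can be sketched as one loop: walk drivers in priority order, pay each transition once, inflate low-confidence estimates, and stop at the budget. Dependency ordering is assumed to be validated beforehand; all names and numbers are illustrative:

```python
def plan_iteration(ranked_drivers, transition_cost, confidence_buffer, budget):
    """ranked_drivers: list of (name, transitions) in priority order,
    already in a dependency-valid order (assumed pre-validated)."""
    selected, paid, spent = [], set(), 0.0
    for name, transitions in ranked_drivers:
        new = [t for t in transitions if t not in paid]  # pay each jump once
        cost = sum(transition_cost[t] * confidence_buffer.get(t, 1.0)
                   for t in new)
        if spent + cost <= budget:
            selected.append(name)
            paid.update(new)
            spent += cost
    return selected, spent

ranked = [
    ("cross_dept_visibility", [("requests", 0, 1), ("requests", 1, 2)]),
    ("audit_readiness",       [("requests", 0, 1), ("records", 0, 1)]),
    ("self_service_portal",   [("requests", 2, 3)]),
]
cost = {("requests", 0, 1): 8, ("requests", 1, 2): 5,
        ("records", 0, 1): 3, ("requests", 2, 3): 13}
buffer = {("requests", 2, 3): 1.5}  # low-confidence estimate carries a buffer

selected, spent = plan_iteration(ranked, cost, buffer, budget=20)
```

Here the second driver costs only its one new transition, and the buffered third driver no longer fits, which is the kind of honest “no” the mechanics are designed to produce.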

Planned vs achieved

Completed iterations record achieved states, divergence classifications, unplanned capability outcomes, emergent deliveries, effort and timeline actuals. The model distinguishes forecast history from factual history.

After the iteration: reconcile reality

Completion is not the absence of surprises; it is the disciplined recording of surprises so the next plan starts from truth, not hope.

  • Classify divergence when targets differ from delivery: budget constraint, estimation error, blocker, descoping, windfall, prerequisite discovery, scope expansion.
  • Capture work that landed outside the plan: unplanned capability advances vs emergent deliveries that suggest new capabilities or infrastructure.
  • Recompute estimation quality, update system ordinals, and write decision entries when state spaces split or rationales go stale.
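A reconciliation record of the kind described above might look like this; the enum values mirror the divergence classes listed, but the field layout is an assumption:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Divergence(Enum):
    BUDGET_CONSTRAINT = "budget constraint"
    ESTIMATION_ERROR = "estimation error"
    BLOCKER = "blocker"
    DESCOPING = "descoping"
    WINDFALL = "windfall"
    PREREQUISITE_DISCOVERY = "prerequisite discovery"
    SCOPE_EXPANSION = "scope expansion"

@dataclass
class Reconciliation:
    capability: str
    target_state: int            # planned ordinal
    achieved_state: int          # delivered ordinal
    divergence: Optional[Divergence]
    narrative: str               # the story behind the gap
    effort_planned: float
    effort_actual: float

    @property
    def estimation_ratio(self) -> float:
        """Actual over planned effort; feeds estimation-quality updates."""
        return self.effort_actual / self.effort_planned

rec = Reconciliation(
    capability="request tracking",
    target_state=2, achieved_state=1,
    divergence=Divergence.PREREQUISITE_DISCOVERY,
    narrative="Legacy identifiers needed normalization before status visibility.",
    effort_planned=5, effort_actual=9,
)
```

Separating achieved state, divergence class, and narrative keeps forecast history distinct from factual history, so the next plan starts from truth.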

Design principles

Capability-centric

Express change as capability motion, not only file churn.

Driver-motivated

No work item without a traced driver; pressure sources include debt and compliance.

Iteration-bounded

Evolution ships in discrete, resource-constrained slices.

Explicitly prioritized

Weights and scales are negotiable artifacts, so arguments become inspectable.

Emergent system state

Overall system position is derived from capability levels, not hand-waved.

Facilitation red flags

“Just tell us what to build”

Likely signal

Hunger for a prescribed solution rather than ownership of trade-offs

Response: Position CDE as the way they keep agency over decisions.

“Everything is critical”

Likely signal

Missing real prioritization

Response: Force rank and make opportunity cost visible on the same dimensions.

“We don’t have time for process”

Likely signal

Prior process trauma

Response: Start with one driver and one capability slice; show value before scaling ceremony.

Deep dives

Governance encoded where teams tend to lose discipline

Health monitoring as a teaching loop

The type system and health predicates encode governance habits: stale dimension rationales, incomplete reconciliation after completion, overlapping unplanned vs planned work, lineage integrity after capability splits. Flags are educational: they tell a team where its model of reality is thin.
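Two of those predicates might be sketched as follows; the record shapes and the 90-day threshold are assumptions for illustration, not a defined CDE schema:

```python
from datetime import date, timedelta

def stale_rationale(dimension: dict, today: date, max_age_days: int = 90) -> bool:
    """Flag a value dimension whose rationale has not been reviewed recently."""
    return today - dimension["rationale_reviewed"] > timedelta(days=max_age_days)

def incomplete_reconciliation(iteration: dict) -> bool:
    """Flag a completed iteration missing its outcome record."""
    return (iteration["status"] == "completed"
            and iteration.get("reconciliation") is None)

flags = []
if stale_rationale({"name": "Risk Reduction",
                    "rationale_reviewed": date(2024, 1, 10)}, date(2024, 6, 1)):
    flags.append("stale dimension rationale: Risk Reduction")
if incomplete_reconciliation({"status": "completed", "reconciliation": None}):
    flags.append("completed iteration without reconciliation")
# each flag points at a place where the model of reality has gone thin
```

The flags are advisory rather than blocking, which matches their teaching role: they prompt a review, not a reject.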

Velocity as learning

Across iterations, priority delivered per euro spent becomes a retrospective signal, not a vanity metric. It tightens estimation and sharpens which drivers belong together in a single timebox.

Caveats

CDE describes a governance and modeling discipline. Organizational adoption, tooling maturity, and data quality still dominate outcomes. The framework makes trade-offs legible; it does not remove politics or uncertainty.

Want to explore CDE on a system you are evolving?