Improvement Strategy

The EASIER Framework

A diagnostic model for identifying what an improvement initiative is actually trying to accomplish. Six dimensions. One primary center of gravity. Every strategic approach, metric, and tool follows from that choice.

The Problem It Solves

Most improvement efforts don't know what problem they're solving.

Organizations adopt new tools, redesign processes, and scale programs without first agreeing on the nature of the constraint. They optimize for speed when the real problem is accuracy. They expand reach when the program isn't working yet. They automate inefficiency into the workflow rather than removing it.

The EASIER framework starts with a diagnosis. Leaders identify one primary dimension: the center of gravity for the initiative. That choice determines the approach, the tools, and the metrics.

There's also a sequencing requirement built in: don't optimize a broken process, don't scale an ineffective program, don't invest in Speed before confirming Accuracy. Skipping those checks is where most improvement efforts go wrong.

The core question

“Which EASIER dimension is the primary constraint on our outcomes?”

How it's used

  1. Describe the improvement opportunity or challenge.

  2. Work through the diagnostic sequence to identify the primary dimension.

  3. Check the prerequisite map: are earlier conditions met?

  4. Choose strategies, tools, and metrics aligned to that dimension.

The Six Dimensions

The two E dimensions have a fixed sequence: E¹ (Effectiveness) always precedes E² (Efficiency).

E¹: Effectiveness

Achieve the intended outcome.

When to use

The approach itself isn't producing the intended result. Not too slow or too expensive. Just not working.

Effectiveness is the foundation of the framework. Before pursuing Efficiency or Reach, Effectiveness must be confirmed. Do not optimize a broken process. Do not scale an approach that doesn't work.

PREREQUISITE: Confirm Effectiveness before pursuing Efficiency or Reach.

Key metrics

  • Outcome attainment rate
  • Goal achievement rate
  • Program completion rate
  • Success rate

Examples

  • Increasing student graduation rates
  • Improving patient health outcomes
  • Improving policy impact
  • Increasing product reliability

A: Accuracy

Improve the correctness of information, predictions, or decisions.

When to use

Leaders are making poor decisions because the data they rely on is wrong, incomplete, or inconsistent.

Accuracy is about whether the organization can trust what it knows. The problem is data quality, measurement reliability, or evaluation consistency. Not a lack of causal understanding. The core question is whether the signals are correct. That's different from Insights, which asks whether they're understood.

Confirm Accuracy before investing heavily in Speed. Fast decisions on bad data don't just fail. They fail faster.

Key metrics

  • Error rate
  • False positive/negative rate
  • Prediction accuracy
  • Inter-rater reliability

Examples

  • More reliable identification of at-risk students
  • Reduced data errors in reporting
  • More consistent evaluation and admissions decisions
  • More accurate demand forecasting

S: Speed

Reduce the time required for a process or decision.

When to use

The work is directionally correct, but delay has measurable consequences: missed windows, harm from slow response, lost opportunity.

Speed matters when timing is the binding constraint, not just when things feel slow. Confirm Accuracy before optimizing Speed, and confirm Effectiveness before investing heavily in Speed. Making a broken process faster magnifies the problem.

PREREQUISITE: Confirm Accuracy before optimizing Speed. Confirm Effectiveness before investing heavily in Speed.

Key metrics

  • Cycle time
  • Response latency
  • Time-to-decision
  • Time-to-intervention

Examples

  • Faster admissions decisions
  • Real-time early-alert systems
  • Reduced approval cycle times
  • Faster crisis response

I: Insights

Discover why outcomes are what they are.

When to use

Leaders have data but lack understanding. Patterns are invisible, causes are unknown, and interventions keep producing the same inadequate results.

Insights serves a special diagnostic role: when you can't clearly identify which primary EASIER dimension applies, Insights work is how you find out. Use Insights to diagnose before committing to a dimension. If the primary dimension isn't clear, Insights comes first.

Insights ≠ Accuracy. Accuracy: are the signals correct? Insights: do we understand what they mean? Different problems, different fixes.

Key metrics

  • Causal hypotheses confirmed vs. refuted
  • Decisions changed by new evidence
  • Root causes documented

Examples

  • Discovering hidden drivers of student attrition
  • Identifying what actually predicts advisor effectiveness
  • Understanding why a program works in some contexts but fails in others
  • Revealing unmet demand patterns

E²: Efficiency

Reduce the resources required to produce the same outcome.

When to use

Effectiveness is confirmed, but the current cost in time, money, or labor is unsustainable.

Efficiency always follows Effectiveness. That sequencing is not optional. E¹ (Effectiveness) asks: Is the work achieving the right outcome? E² (Efficiency) asks: Is the work sustainable? Efficient execution of the wrong approach is waste. Fix the process first, then optimize it.

PREREQUISITE: Confirm Effectiveness (E¹) before pursuing Efficiency (E²).

Key metrics

  • Cost per unit of outcome
  • Labor-to-output ratio
  • Waste rate
  • Staff-to-caseload ratio

Examples

  • Automating repetitive administrative tasks
  • Reducing cost per enrolled student
  • Improving advisor caseload capacity without reducing quality
  • Eliminating low-value processes and redundant systems

R: Reach

Expand who benefits from what the organization does.

When to use

The core work is effective, but scale, geography, or structural access barriers limit who benefits. The question is both how many and who.

Volume and equity are separate questions. An organization can serve large numbers while consistently missing populations by income, race, geography, or first-generation status. Any Reach initiative should name which populations are underserved and why.

EQUITY: Two organizations with identical scale numbers can have radically different equity profiles. Who is being left out matters as much as how many are being served.

PREREQUISITE: Confirm Effectiveness before expanding Reach. Scaling an ineffective program harms more people.

Key metrics

  • Coverage rate
  • Equity gaps across subgroups
  • Growth in underserved populations reached
  • Access barrier reduction

Examples

  • Serving more students without proportional staffing increases
  • Expanding access for first-generation or rural students
  • Closing equity gaps in program participation
  • Reducing structural access barriers

The Diagnostic Sequence

When the primary dimension isn't obvious, work through this in order. Stop at the first question that reveals the constraint. Insights comes first: if you don't understand why outcomes are what they are, you can't reliably pick the right dimension.

1. Do we understand WHY current outcomes are what they are?
   If yes: continue. If no: → INSIGHTS first.

2. Is the core approach actually producing the intended outcome?
   If yes: continue. If no: → EFFECTIVENESS.

3. Is the information driving decisions reliable and accurate?
   If yes: continue. If no: → ACCURACY.

4. Is timing a binding constraint? Does delay cause direct harm or missed opportunity?
   If yes: → SPEED. If no: continue.

5. Is cost or sustainability the binding constraint?
   If yes: → EFFICIENCY. If no: continue.

6. Is scale or equitable access the binding constraint?
   If yes: → REACH. If no: return to Insights; the constraint isn't yet understood.
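
The sequence is a first-match decision procedure: ask the questions in order and stop at the first one that reveals the constraint. As an illustrative sketch only (the `diagnose` function and its answer keys are hypothetical names, not part of the framework):

```python
def diagnose(answers):
    """Walk the EASIER diagnostic sequence in order and return the primary
    dimension. `answers` maps each question to True/False; the keys below
    are illustrative names, not official framework terms."""
    if not answers["understand_why"]:   # Q1: do we know WHY outcomes are what they are?
        return "Insights"
    if not answers["approach_works"]:   # Q2: is the core approach producing the outcome?
        return "Effectiveness"
    if not answers["data_reliable"]:    # Q3: is the information driving decisions accurate?
        return "Accuracy"
    if answers["timing_binding"]:       # Q4: does delay cause direct harm?
        return "Speed"
    if answers["cost_binding"]:         # Q5: is cost or sustainability the constraint?
        return "Efficiency"
    if answers["access_binding"]:       # Q6: is scale or equitable access the constraint?
        return "Reach"
    return "Insights"                   # nothing matched: the constraint isn't understood yet

# Example: the approach works and the data is sound, but delay causes harm.
print(diagnose({
    "understand_why": True,
    "approach_works": True,
    "data_reliable": True,
    "timing_binding": True,
    "cost_binding": False,
    "access_binding": False,
}))  # → Speed
```

Note that the order of the `if` statements encodes the framework's priority: an Insights or Effectiveness gap short-circuits everything that follows.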

The Prerequisite Map

These aren't suggestions. Skipping them is where improvement efforts go wrong.

Do not pursue Efficiency until Effectiveness is confirmed.

Efficient execution of the wrong approach is waste. Automating a broken process makes it worse.

Do not pursue Reach until Effectiveness is confirmed.

Scaling an ineffective program harms more people and wastes more resources.

Confirm Accuracy before investing heavily in Speed.

Fast decisions on bad data don't just fail. They fail confidently.

When the primary dimension is unclear, begin with Insights.

You can't pick the right dimension if you don't understand why outcomes are what they are.
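
The prerequisite rules above can be checked mechanically before a strategy starts. A minimal sketch, assuming a chosen primary dimension and two confirmation flags (the `prerequisite_warnings` function and its parameter names are illustrative, not part of the framework):

```python
def prerequisite_warnings(primary, effectiveness_confirmed, accuracy_confirmed):
    """Return the prerequisite rules violated by choosing `primary`
    before the required conditions are confirmed."""
    warnings = []
    if primary == "Efficiency" and not effectiveness_confirmed:
        warnings.append("Confirm Effectiveness before pursuing Efficiency.")
    if primary == "Reach" and not effectiveness_confirmed:
        warnings.append("Confirm Effectiveness before expanding Reach.")
    if primary == "Speed":
        if not accuracy_confirmed:
            warnings.append("Confirm Accuracy before optimizing Speed.")
        if not effectiveness_confirmed:
            warnings.append("Confirm Effectiveness before investing heavily in Speed.")
    return warnings

# Example: Speed chosen while the underlying data is still unreliable.
for w in prerequisite_warnings("Speed", effectiveness_confirmed=True,
                               accuracy_confirmed=False):
    print(w)  # → Confirm Accuracy before optimizing Speed.
```

An empty return means the chosen dimension's prerequisites are met and the work can proceed.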

Framework in Practice

Same problem. Six different strategies.

The dimension you choose determines the work, the investments, and what success looks like. Here's how one common challenge plays out differently depending on which dimension you identify as primary.

Problem: Student advising is falling short.

Effectiveness

Redesign advising to actually move graduation rates. Test what works.

Accuracy

Build better risk identification so advisors find the right students.

Speed

Intervene faster; catching students two weeks earlier matters.

Insights

Understand the real drivers of attrition, not just the proxies.

Efficiency

Reduce advisor workload per case so each conversation is higher quality.

Reach

Support more students, especially first-gen and low-income, without adding headcount.

Each path is valid. Each leads to different work, different investments, and different metrics. The framework doesn't pick the answer. It makes the choice visible so the right people can make it, and so the prerequisites get checked before the strategy starts.

The core principle

Every improvement initiative should identify one primary dimension. That's its center of gravity.

Other dimensions still matter. But the initiative needs a center of gravity. Trying to improve everything at once leads to unclear priorities, mismatched tools, and metrics that don't tell you anything. Pick the dimension that matters most. Check the prerequisites. Let that choice drive the work.

Downloads

EASIER One-Pager

A single-page overview of all six dimensions: when to use each, key metrics, prerequisite rules, and the diagnostic sequence.


Diagnostic Guide

A practical guide for applying the framework: how to identify the right primary dimension, the diagnostic sequence, prerequisite checks, and common pitfalls.


Framework Reference

The complete framework reference: all six dimensions with full definitions, canonical metrics, prerequisite map, Insights diagnostic role, and the E¹/E² relationship.


Try the AI tool

Open the EASIER App

Describe a challenge. The app identifies the right dimension, asks follow-up questions, and generates a structured analysis with strategic approaches and metrics.