Metrics that Move
Why Activity Counts Don’t Equal Equity

Every institution has a story like this.
A DEI office proudly reports that it hosted 50 events, trained 2,000 staff, and reached 5,000 students with its programs. The slides look impressive. The numbers fill the page. The leadership team applauds.
And then the next question comes: what changed?
That’s when the room goes quiet. Retention hasn’t moved. Gateway pass rates are flat. The gaps between white students and students of color look the same as they did last year. The activity is real, but the outcomes are unchanged.
These are vanity metrics—numbers that measure motion, not progress. They are easy to count, easy to present, and dangerously easy to hide behind. But they don’t influence budgets, they don’t shift policy, and they don’t protect equity when the political climate turns hostile.
If equity work is going to last, vanity metrics have to go.
The False Comfort of Busy Numbers
Vanity metrics thrive because they make institutions feel safe. They show that “something is happening.” They fill annual reports with data points that look impressive from a distance. But inside the institution, they don’t drive decisions.
Boards don’t reallocate millions based on event counts. Faculty don’t change pedagogy because 500 people attended a workshop. Students don’t persist because an office hosted more socials.
The danger is not just that these numbers are meaningless. The danger is that they create the illusion of impact where none exists. And in today’s climate, illusions are liabilities. Legislators and trustees who are skeptical of equity commitments are quick to seize on vanity-heavy reports. Their critique is simple: lots of money, lots of activity, no measurable results. Without outcome evidence, that critique lands.
Why Outcomes Are the Only Currency That Counts
The institutions that earn trust focus on outcomes. They report retention shifts, gateway course success, advising access, credit momentum, and climate signals that correlate with persistence. These are the metrics executives recognize because they shape revenue, rankings, and reputation.
State funding systems already operate this way. Outcomes-based funding doesn’t reward events—it rewards credit completion and graduation. Equity work that clings to vanity metrics looks out of sync with this reality.
Atlas & Crown’s research shows the same pattern inside campuses. When multicultural student programs or first-year interventions are resourced and tracked against retention outcomes, the numbers move. Not always dramatically, but measurably—and measurability is what changes the budget conversation.
When the Numbers Shift
Consider one midsize regional university. For years, its DEI office highlighted the number of workshops offered and the volume of students served by programs. Despite hundreds of activities, the one-year retention gap between Black and white students held steady at 14 points.
After an internal review, the institution dropped 80 percent of its vanity measures and refocused reporting on five outcomes: retention, gateway course pass rates, credit momentum, advising access, and climate. Within two years, after reallocating funds to tutoring in high-DFW courses and expanded staffing for multicultural student programs and services (MSPS), the retention gap narrowed to 8 points.
The activity didn’t stop. Events still happened. Trainings continued. But leadership stopped measuring them as proof. The page shifted from “busy” to “credible.”
Why Fewer Numbers Build More Trust
There’s a counterintuitive lesson here: the fewer the metrics, the stronger the story.
When leaders are confronted with 30 measures, they cherry-pick. Success becomes whatever looks good on the slide. When they are confronted with seven measures, tied to budgets and disaggregated by race, there is no escape. The story is clear, the accountability is sharper, and the decisions are harder to avoid.
Fewer metrics also make it harder for equity work to be dismissed as fluff. A scorecard that highlights retention gains for students of color is politically harder to cut than a deck that highlights attendance totals.
The Role of Disaggregation
Even outcomes can become vanity if they aren’t disaggregated. An “overall” retention lift that masks stagnant or declining results for Black or Indigenous students is a false signal. That’s why every outcome must be race-conscious. QuantCrit makes this clear: averages can obscure harm. Reporting must insist on breaking down results by subgroup, even when the numbers are uncomfortable. Especially when they are uncomfortable.
When institutions resist disaggregation, they aren’t protecting themselves—they’re erasing the very inequities they claim to address.
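The masking effect is easy to see in the arithmetic itself. The sketch below uses hypothetical cohort numbers (not data from any real institution) to show how an overall retention rate can rise while retention for Black students falls:

```python
# A minimal sketch with hypothetical cohort figures (not real data),
# showing how an "overall" retention lift can mask a subgroup decline.

def rate(retained, enrolled):
    """Retention rate as a percentage of the entering cohort."""
    return round(100 * retained / enrolled, 1)

# (enrolled, retained) by group and year -- illustrative figures only
year1 = {"white": (1800, 1440), "black": (400, 288)}   # 80.0%, 72.0%
year2 = {"white": (1900, 1558), "black": (380, 266)}   # 82.0%, 70.0%

for label, year in (("Year 1", year1), ("Year 2", year2)):
    total_enrolled = sum(e for e, _ in year.values())
    total_retained = sum(r for _, r in year.values())
    print(f"{label} overall: {rate(total_retained, total_enrolled)}%")
    for group, (e, r) in year.items():
        print(f"  {group}: {rate(r, e)}%")
```

With these assumed numbers, the overall rate climbs from 78.5 to 80.0 percent even as Black student retention drops from 72 to 70 percent. A report built on the overall figure alone would celebrate a harm.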
Why Small Lifts Matter
Another trap is dismissing small gains. Leaders often ask: Is a two-point retention increase worth celebrating?
The answer is yes. A two-point gain at scale translates into hundreds of students persisting, millions in tuition revenue, and measurable improvement in equity gaps. Small numbers compound into large impact over time. What matters is that they are disaggregated, tied to budget logic, and reported consistently.
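The back-of-envelope math is worth running explicitly. The sketch below uses assumed inputs (cohort size, net tuition, average years remaining) purely for illustration; substitute your own institution's figures:

```python
# Back-of-envelope impact of a small retention lift.
# All inputs are assumed for illustration, not drawn from any real institution.

def lift_impact(cohort_size, lift_points, net_tuition, years_remaining):
    """Students retained by the lift and the tuition revenue they represent."""
    extra_students = cohort_size * lift_points / 100
    revenue = extra_students * net_tuition * years_remaining
    return extra_students, revenue

students, revenue = lift_impact(
    cohort_size=10_000,      # entering cohort (assumed)
    lift_points=2,           # retention gain in percentage points
    net_tuition=12_000,      # net tuition per student per year (assumed)
    years_remaining=3,       # average additional years enrolled (assumed)
)
print(f"{students:.0f} additional students retained")
print(f"${revenue:,.0f} in retained tuition revenue")
```

Under these assumptions, a two-point lift means 200 additional students persisting and roughly $7.2 million in tuition revenue that would otherwise have walked out the door.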
Vanity metrics inflate progress in ways that don’t hold up. Outcome metrics, even small ones, provide durable proof.
The Cost of Clinging to Vanity
Institutions that cling to vanity metrics pay three hidden costs:
Faculty disengagement. Faculty see little connection between workshop attendance counts and classroom realities, and tune out DEI reports.
Student skepticism. Students notice the gap between reported activity and lived experience. They stop trusting the institution’s promises.
Board frustration. Trustees, under pressure to show return, dismiss vanity-heavy reports as spin. That frustration often translates into cuts.
In each case, vanity numbers don’t just fail to help—they actively hurt credibility.
Moving Beyond Performance Theater
The temptation to report activity is strong, especially when pressure to “show impact” is high. But performing equity through busy slides is a short-term survival strategy. It buys applause in one meeting and costs credibility in the next.
Retiring vanity metrics is an act of discipline. It means resisting the urge to fill slides with what looks good and instead publishing what moves outcomes. It means being transparent about blind spots. It means choosing fewer, harder, more consequential measures—and then standing by them.
That discipline is what builds resilience.
Why This Matters Now
The political climate around equity is not forgiving. Legislatures are limiting disaggregated reporting. Boards are cutting DEI lines. Presidents are being asked to defend every dollar. In this environment, vanity metrics are indefensible. They invite criticism without offering protection. Outcome metrics, by contrast, tie equity directly to institutional survival. When equity investments are shown to improve retention, stabilize revenue, and enhance reputation, the work becomes harder to dismiss.
Boards and executives are not persuaded by activity—they are persuaded by outcomes. That’s the reality equity leaders have to embrace.
Activity counts are easy to celebrate but impossible to defend. Outcomes are harder to achieve but easier to protect.
If equity work is going to survive turbulence, it has to retire vanity metrics and embrace outcome measures that drive decisions. That shift builds trust with students, credibility with faculty, and protection with boards.
If your institution is still measuring activity instead of outcomes, it’s time to change the story. Book a consult with Atlas & Crown today and let’s build the scorecard that proves equity delivers.