Faster Toward What?
(This is part 2 of a three-part series exploring the core ideas of my book Fragile Systems: An Ecological Approach to AI in Government.)
There's a seductive logic to efficiency in government. Faster is better. Streamlined is superior. But when it comes to AI in local government, our obsession with speed and savings is leading us toward a dangerous cliff. The biggest barrier we face in the public sector is trust, and I am not aware of a single healthy relationship built on efficiency. I know of many, though, that have lasted decades because they are founded on trust.
Efficiency isn't neutral in civic life. It reflects assumptions about what counts, who matters, and which tradeoffs we're willing to accept. AI can compress cycle times, but unless we understand how time and attention actually function inside our organizations, we risk accelerating the wrong things: brittle processes, opaque decisions, and resident frustration dressed up as throughput. True value comes from ecological fit — strengthening the living system of people, policies, and relationships that constitute governance.
How many of us have dreamed of a better relationship? If you've been married more than two minutes, I know you have! When we dream of a better relationship, our mind often goes to trust. Maybe you think of time spent together, measured qualitatively, quantitatively, or both. Not once have I ever dreamed of a more efficient relationship with a friend or family member. Not once have I thought, "You know, one way to improve my marriage is to streamline our dates."
Instinctively and almost subconsciously, we understand that healthy relationships are not built by speed or efficiency. They are built on trust, and trust takes time.
The same applies to civic relationships.
Not All Time Is Equal
Not all time has equal value in civic ecosystems. The ten minutes a housing caseworker spends with a vulnerable client forms a critical connection point, potentially preventing homelessness and stabilizing multiple lives. Meanwhile, an hour spent generating a standardized monthly report may contribute minimal ecological value.
When we implement AI tools that save staff five minutes across dozens of tasks, those fractional savings often dissipate as unusable energy throughout the system. Without intentional cultivation of how reclaimed time will function, the potential energy simply redistributes along existing pathways.
AI often 'returns' time in fragments: five minutes here, eight minutes there. On their own, those scraps rarely cohere into capacity. Only when leaders intentionally re-aggregate them does time turn into real value.
Ask three practical questions before you greenlight a time-saving tool:
- What kind of time is being saved? Is it generative time (where staff build trust, exercise discretion, prevent harm) or maintenance time (rote tasks with low relational value)?
- Where will saved minutes land? Can you bundle them into blocks that support reflective work, cross-team coordination, or proactive outreach?
- Who converts saved time into outcomes? If no one "owns" the conversion, those minutes evaporate into busyness — the system's equivalent of heat loss.
A simple exercise: take a single process (e.g., housing assistance intake) and map time types across it. Label moments that anchor trust ("keystone minutes"), then protect them from automation. If automation touches them anyway, pair it with a countervailing investment in human contact.
Friction Isn't Failure
In ecosystems, some friction is protective. Pauses allow context to surface; redundancies keep systems resilient; second looks prevent harm. When we shave away every delay in permitting, code enforcement, or benefits review, we may also remove the very checks that create legitimacy.
Two ways to separate helpful friction from waste:
- Name the function. If a step converts uncertainty into judgment (e.g., a planner clarifying intent with an applicant), it likely has ecological value. Keep it — and equip it.
- Shift the friction, don't delete it. Move it earlier (clearer guidance, examples, and pre-submission checks) or deeper (structured escalation for edge cases) rather than letting it accumulate at the counter, where it feels like bureaucracy.
A practical habit: for each AI use case, define two beneficiaries in advance: (1) the staff role that should see workload relief, and (2) the resident profile that should experience clearer, fairer service. If you can't name both, the benefit will likely pool in one place and erode somewhere else.
Perhaps most dangerously, efficiency improvements often treat symptoms while leaving root causes untouched. Just as natural systems rely on resistance, boundaries, and cyclical delays, governance benefits from deliberate pause points that allow for adaptation, input gathering, and system safeguards.
AI vendors typically present efficiency metrics that reflect mechanical aspects of system performance: hours saved, reduced call volume, accelerated response times. Yet these measurements capture only a fragment of the civic ecosystem's functioning. They fail to monitor relationship quality, system adaptability, equity of outcomes, or long-term resilience—the true indicators of ecological health.
Procurement as Ecosystem Design
When looking to implement AI, you're not just buying software; you're introducing a new species into your civic habitat. Treat contracts as environmental commitments, not just price sheets: specify how the tool will be monitored, who stewards it, and what happens when it disrupts the habitat it enters.
If we're serious about responsible AI adoption, we must expand beyond the question of how much time is saved and begin asking what kind of time is being saved, and what it allows us to become. This requires developing sensory capabilities to perceive what truly matters in living systems: civic trust, staff wellbeing, system learning capacity, and community resilience.
Instead of asking "How much will this save us?" we need to ask ecological questions: How will this intervention affect the web of relationships that make up our civic ecosystem? Does it strengthen our ability to learn collectively rather than merely perform individually? Does it preserve space for human judgment, compassion, and community voice in decision-making?
These aren't efficiency questions. They're stewardship questions, designed not to optimize the system, but to understand what kind of system we're shaping through our choices.
The efficiency illusion depends on ecological blindness—seeing only certain flows while missing crucial relationships and feedback loops. But public institutions aren't machines to be tuned; they're living systems to be tended. The transformative potential of AI lies not in optimization, but in its capacity to strengthen the living systems that constitute public work.
How might your next technology decision change not just what gets done, but how people relate to each other and to the institution itself? What would it look like to measure success not by speed alone, but by the health of the entire ecosystem you're stewarding? And what kind of governance becomes possible when we stop chasing efficiency and start cultivating resilience?
The challenge ahead is translating these ecological insights into practical frameworks that can guide both procurement decisions and performance evaluation—ensuring that our tools serve not just our metrics, but our mission.
And this framing from Harvard Kennedy School Executive Education still applies: public managers are responsible, first and last, for creating public value. Our job is not merely shaving costs, but aligning tools to mission, trust, and legitimacy.
Next Issue
The third and final part of this series asks why AI succeeds or stalls. I'll introduce an "alternative pathways" lens (What if we didn't use AI?) and reframe pilots as ecological introductions that require stewardship, not just evaluation. This will require us to translate these ideas into governance language (decision thresholds, appeal routes, and named stewards) so they become usable in policy work.
Picking your brain
- Have you ever protected a "friction point" in a process because it added fairness or legitimacy, even when others pushed to remove it? What happened?
- What's an example in your work where chasing efficiency risked undermining relationships, and how did you navigate that tradeoff?
- In your organization, where have you found that "slowing down" actually built more trust with residents or staff?