
A new $120 MM bipartisan philanthropic effort to recode American government recently launched. AI agents continue to evolve. Public processes, particularly in places like California, continue to plug along steadily, sometimes as almost unchanging, zombie-like throwbacks to an older era in a rapidly changing world.
What are useful applied research questions as government operations go through a massive system upgrade, an epochal change on the order of the industrialization of America? That era saw a bipartisan reform effort that led to what we now know as the civil service, professional budgets, and the administrative structures we still depend on today. It also produced the layers of process that many now see as barriers to progress.
As we look to address pressing public problems like inequality of opportunity, the ongoing climate crisis, the increased risk of nuclear oblivion, and growing geopolitical upheaval, I put together a list of applied research questions, ranging from the atomic to the systemic, that I've been wrestling with. My hope is that they're useful as we work to pioneer new modes of government operations that can meet those mountains.
What are the specific steps in a typical permit application from intake to close, and how do the procedures differ between Los Angeles and Salt Lake City?
Where do frontline staff create unofficial workarounds—and why?
Which decisions are made at the counter, which are escalated, and how long does escalation take?
What design features make a permitting system or other public process self-correcting rather than self-compounding in procedural sludge?
What are the repeatable affordances for AI to help achieve focused process objectives like fairness and time to completion, or external outcomes like environmental stewardship, good business practices, etc.?
What is the first step in making open data MCP-compliant?
What about other public processes, like requesting engineering as-builts? Fee schedules? Permit applications?
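To make the MCP question concrete, here is a minimal sketch of what "MCP-compliant" open data could look like at the tool-descriptor level: wrapping a dataset query in the MCP tool shape (`name`, `description`, a JSON Schema `inputSchema`) so an AI agent could discover it. The dataset, the `lookup_fee_schedule` name, and the fields are all hypothetical; a real server would also implement the full MCP handshake and transport.

```python
import json

# Illustrative only: turn an open dataset into an MCP-style tool
# descriptor. Field names and the example dataset are hypothetical.
def dataset_to_tool_descriptor(dataset_name: str, description: str,
                               query_fields: dict) -> dict:
    """Describe an open dataset as a tool an AI agent could call."""
    return {
        "name": f"lookup_{dataset_name}",
        "description": description,
        "inputSchema": {  # JSON Schema describing the query parameters
            "type": "object",
            "properties": {
                field: {"type": ftype, "description": desc}
                for field, (ftype, desc) in query_fields.items()
            },
            "required": list(query_fields),
        },
    }

descriptor = dataset_to_tool_descriptor(
    "fee_schedule",
    "Look up permit fees by permit type for a given city.",
    {"city": ("string", "City name, e.g. Los Angeles"),
     "permit_type": ("string", "e.g. residential-addition")},
)
print(json.dumps(descriptor, indent=2))
```

The point of the sketch: the first step is less about servers and more about writing down, in machine-readable form, what question the dataset can actually answer.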
Does the absence of clear sunset clauses or kill criteria for processes within a large organization, like a corporation or public agency, lead to Kafka-index increases?
How do residents describe needs differently than how agencies classify them?
How might bounty programs be deployed to address procedural bloat?
What about micro-bounties for journey mapping specific public procedures and writing blog posts? Something simple like this one.
Which California Performance Review recommendations succeeded? Why or why not? Which are still worth implementing in the next gubernatorial administration?
What are the lessons of DC's city GitHub experiment? Would that early experiment be strengthened by a "right to refactor" public processes through an official pull request? Should residents be able to “read” and “debug” their government’s processes just like lawyers read laws or analysts decipher budgets?
What have been the successes of France's public sector digitization efforts? How do those compare to the UK's digital government? What about other international examples?
How might we make peer to peer knowledge sharing between public servants easier? There are a bunch of online forums. I have participated in some international networks that work through simple facilitated video calls. How might we make that type of knowledge sharing more routine?
How might we nurture planetary scale polycentric governance models for the new city operating system?
What would a model city charter for 2030 look like? Who is putting that together?
What are the key components of process time? What are the bottlenecks?
How might tools like STARA be open sourced and redeployed at scale for every municipality across California?
Is it possible to inventory all the potential permits required for specific construction projects like housing or public infrastructure across California?
What is the resident’s cognitive time cost in hours for each service pathway?
Can we forecast time-to-completion at intake with confidence intervals?
What is the “uncertainty premium” a user pays when there is no clear end date?
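The forecasting question above can be sketched with nothing more than historical closed-case durations. This is an illustrative bootstrap of the median completion time, producing a point estimate and an 80% confidence interval at intake; the dataset, the interval choice, and the `forecast_completion` name are all assumptions for the sake of the example.

```python
import random
import statistics

# Hedged sketch: estimate typical time-to-completion for a service
# pathway from historical closed cases, with an interval via the
# bootstrap (resample, recompute the median, take quantiles).
def forecast_completion(historical_days: list[int],
                        n_boot: int = 2000,
                        seed: int = 42) -> tuple[float, float, float]:
    """Return (point estimate, lower, upper) for an 80% interval."""
    rng = random.Random(seed)
    medians = []
    for _ in range(n_boot):
        sample = rng.choices(historical_days, k=len(historical_days))
        medians.append(statistics.median(sample))
    medians.sort()
    point = statistics.median(historical_days)
    lower = medians[int(0.10 * n_boot)]
    upper = medians[int(0.90 * n_boot)]
    return point, lower, upper

# Hypothetical permit closure times (in days) for one pathway:
history = [30, 45, 38, 52, 41, 60, 35, 48, 44, 39, 55, 42]
point, lo, hi = forecast_completion(history)
print(f"Expect ~{point} days (80% interval: {lo}-{hi})")
```

Even this toy version makes the "uncertainty premium" measurable: the width of the interval is a number a counter clerk could quote at intake.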
What are historical examples of addressing procedural sludge without a major war or revolution? What can we learn from those today?
What internal protocols have succeeded in effective internal data sharing within public agencies?
What about securely sharing valuable public data between public agencies and academia for applied research? Or sharing critical information for emergency response?
How might useful digital public goods be more routinely repurposed globally? Here I am thinking of tools like Estonia's X-Road and similar platforms.
For one service, can we enumerate the full rule set and mark which are statutory, regulatory, policy, or habit?
Can that enumeration be codified into a protocol that can be requested for any, or at least many, rule sets?
What fraction of rules are decision thresholds that could be published as machine-readable checklists? How might those be made MCP compliant?
How might we build a canonical rule registry with provenance and change history?
How much mission critical public information is still in paper filing cabinets?
How might a public service year type of fellowship help provide digital transformation capacity to every municipality and quasigovernmental NGO in a state like California? What would it mean to spin up a fellowship stitching together 10,000 fellows so that every local municipality, JPA, and random arcane quasigovernmental institution has an energetic, digitally talented fellow?
What are three no-regrets steps every municipal manager in California might take today to prepare for an agentic AI future as it relates to gov ops?
Does a protocol having a certain amount of legitimacy or legibility also lead to other qualities? Are, for example, more generative protocols typically more learnable or ludic?
How do power asymmetries manifest in protocol design? When does too much participant control (the Bartleby condition) or too little (the Kafka condition) degrade collective outcomes?
What can we learn from how great public works of the past were envisioned, designed, procured and ultimately delivered? What lessons can we take for today's challenges to work through the procedural bloat?
How might citizens' assemblies help galvanize support for structural change at the level of consolidating obsolete municipalities, charter revisions, etc.?
What would public engagement designed from the internet up look like today? How might AR be instituted by default in physical construction planning and community engagement? How might statistically valid samples of community sentiment be conducted? How might citizens share their voice beyond the standard two minutes of public-comment fame?
How might AI and other statistical tools help electeds and managers make sense of the incredible variety and volume of information capturing citizen sentiment?
How might this wave of public sector reform get past the tipping point of shiny pilots to deliver obvious improvements, ones that 80+% of residents notice and say, "yes, that's clearly better"?
Where is the Bureau of Municipal Research for today's epochal shift in government operations? What might an internet-native applied R&D effort look like there? How might such an effort redeploy successful models like the Arc Institute's principal investigators, open-source software, and networks of contributing researchers?
Please share your thoughts, reactions, missives, case studies, ponderances and reflections in the comments! Thanks much.
Epigraph:
Bring me men to match my mountains,
Bring me men to match my plains,
Men with empires in their purpose,
And new eras in their brains.
It needs some more generally human protocols, though that poem at the State Capitol does offer a good sentiment to increase our ambition!