It’s obvious to anyone not pretending otherwise: the consultant-industrial complex that props up modern government is broken. Not broken like a vase fallen from a shelf—broken like a machine still running but long since severed from its original design, whirring and wheezing under the weight of obsolete incentives and institutional amnesia.
Governments don’t need more 90-page PowerPoints from firms that bill by the hour and dilute their insights into the safety of jargon. What they need—what we need—is a Civic Data Science Marketplace: a living, breathing agora of ideas, protocols, and precision-crafted insight matched to real problems in real time. Not a vendor list. A living protocol. Not RFPs written in codewords from the last century, but something fresh, dynamic—an orchestration of minds, machines, and the messy wisdom of lived civic experience.
We are already living in the epoch of the digital tsunami—wave upon wave of open datasets, statistical methods once reserved for ivory towers, and computational firepower now pocket-sized. And from these waves emerges a new form: a marketplace not of goods, but of sensemaking.
What’s missing is not the talent. The talent is here. It’s on GitHub, Warpcast, underemployed in academia, writing Substacks in exile from institutions that wouldn’t fund a moonshot if it came wrapped in a Gantt chart. What’s missing is the connective tissue—protocols that let these minds meet public needs with radical specificity.
This is where AI enters—not as a terminator, not as a savior—but as a matchmaker and enabler. Imagine not another dashboard, but a civic oracle: you, a municipal manager with a groundwater compliance conundrum; the algorithm, whispering back, “Here’s Maria, who built a machine-learning model on well logs in Kern County and has three ideas to bootstrap from.” No bureaucracy. No twelve-month procurement cycle. Just insight, surfacing at the speed of thought.
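To make the matchmaking idea concrete, here is a deliberately minimal sketch—every name, profile, and scoring choice is hypothetical, not a real system. It ranks analyst profiles against a problem statement by bag-of-words cosine similarity; a production marketplace would use richer embeddings, track records, and verification, but the shape of the protocol is the same: problem in, ranked companions out.

```python
# Illustrative sketch only: match a civic problem statement to analyst
# profiles via bag-of-words cosine similarity. All names and profiles
# here are hypothetical.
import math
import re
from collections import Counter

def vectorize(text: str) -> Counter:
    """Lowercase, tokenize, and count words."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in a if w in b)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def match(problem: str, profiles: dict[str, str]) -> list[tuple[str, float]]:
    """Rank analyst profiles by similarity to the problem statement."""
    pv = vectorize(problem)
    scored = [(name, cosine(pv, vectorize(bio)))
              for name, bio in profiles.items()]
    return sorted(scored, key=lambda s: s[1], reverse=True)

profiles = {
    "Maria": "machine learning model on groundwater well logs in Kern County",
    "Dev":   "transit ridership forecasting with GTFS feeds",
    "Lena":  "wildfire risk mapping from satellite imagery",
}
problem = "municipal groundwater compliance using well logs"
print(match(problem, profiles)[0][0])  # prints Maria
```

The point is not the scoring function—swap in whatever model you like—but that the surface area is tiny: a registry of profiles and a ranking call, no twelve-month procurement cycle in between.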
To build this requires not a skunkworks, not a Manhattan Project, but a decentralized ecosystem. Where the Manhattan Project cloistered genius in desert labs, this version disperses it across a protocolized agora—call it the Evil Morty Protocol.
In the show Rick and Morty, Evil Morty dares to question the Citadel—the place where all Ricks believe they are the smartest in the multiverse. But it’s a citadel built on sameness, on recursive self-reference. Sound familiar? Our civic systems are riddled with their own Citadels—echo chambers of stale expertise. Evil Morty doesn’t just rebel. He builds a path out. That’s what this marketplace is: not a rebellion against expertise, but its liberation.
This is not the same as the AI mega-labs scrambling for AGI under the watchful gaze of investors and compliance teams. No, this is something trickier, weirder—more alive. A public infrastructure project, yes—but one forged in the aesthetic of web3: decentralized, composable, credibly neutral. As Stephen from Interfluidity writes in his sharp post on institutional design for AI and open science:
Ultimately, we should want to generate a reusable, distributed, permanent, and ever-expanding web of science, including conjectures, verifications, modifications, and refutations, and reanalyses as new data arrives. Social science should become a reified public commons. It should be possible to build new analyses from any stage of old work, by recruiting raw data into new projects, by running alternative models on already cleaned-up or normalized data tables, by using an old model's estimates to generate inputs to simulations or new analyses.
This is precisely the insight for civic econometrics in the age of open data. Rather than centralizing analysis into stale contracts and rigid bureaucracies, we must create protocols that reward sharp, timely contributions—from anywhere. The Civic Data Marketplace is how we turn open science into open service.
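One way to picture the "build new analyses from any stage of old work" idea in the quote above is a content-addressed registry of analysis artifacts. The sketch below is a toy, assuming nothing beyond the quoted vision: every artifact—raw data, cleaned table, model estimate—gets a hash address and records the hashes it built on, so anyone can recruit any prior stage into new work and the full lineage stays auditable.

```python
# Toy sketch (all identifiers hypothetical) of a content-addressed
# commons: artifacts are stored under a hash of their contents, and
# each new analysis records the hashes of the stages it reused.
import hashlib
import json

REGISTRY: dict[str, dict] = {}  # content hash -> artifact record

def publish(kind: str, payload: dict, parents: list[str] = ()) -> str:
    """Store an artifact and return its content address."""
    record = {"kind": kind, "payload": payload, "parents": list(parents)}
    blob = json.dumps(record, sort_keys=True).encode()
    digest = hashlib.sha256(blob).hexdigest()[:12]
    REGISTRY[digest] = record
    return digest

def lineage(digest: str) -> list[str]:
    """Walk back through an artifact's ancestry, root last."""
    record = REGISTRY[digest]
    out = [f"{digest} ({record['kind']})"]
    for parent in record["parents"]:
        out.extend(lineage(parent))
    return out

raw = publish("raw-data", {"source": "county well logs, 2024"})
clean = publish("cleaned-table", {"rows": 1200}, parents=[raw])
model = publish("estimate", {"method": "hierarchical regression"},
                parents=[clean])
# A reanalysis recruits the already-cleaned table, not the raw file:
reanalysis = publish("estimate", {"method": "alternative model"},
                     parents=[clean])
print(lineage(model))
```

Because the reanalysis points at the same cleaned-table hash as the original estimate, alternative models share provenance automatically—exactly the "reified public commons" behavior the quote describes, here in a few dozen lines.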
It becomes a marketplace of discovery, not compliance. Of catalytic iteration, not bloated reports. It is the inverse of bureaucratic paralysis—it's a jazz improvisation session where each solo leads to the next modulation in key.
This is where Second Foundation work begins: not in a grand Manhattan-style project, but in the hidden corners where real people meet strange public problems and don’t know who to call. Now they will. Now the call can be answered—not by the same consultants slinging the same pseudo-solutions, but by companions in the work.
Stephen wrote that vision in 2014 and noted that it was increasingly possible technologically, thanks to cryptography and the maturing web. Today all the pieces are in place. The question is whether we will have the boldness to build it—not just the tech, but the experiments and protocols that will help transform government operations from the inside out.
Seeding the Second Foundation Series
@patwater
🚏🌪️ The US is spending $1 trillion a year on climate damage

A new analysis from Bloomberg reveals that the US spent $1 trillion—a full 3% of GDP—on “disaster recovery and other climate-related needs” in the 12 months from May 2024 to May 2025. Skyrocketing insurance costs (which have doubled since 2017), power outages, and post-disaster repair are major drivers. (Interestingly, scientists estimate that many geoengineering schemes would cost just a few billion dollars a year, which makes one wonder whether a single insurer or country could run a geoengineering project that benefits everyone and still make a profit individually.)
This makes me wonder what the current law/liability around geoengineering even is… has anyone regulated it yet?
Definitely… see the pushback on ocean algae seeding and various other experiments. Not always regulation of geoengineering per se so much as regulation of oceans and other normal regs. Also lots of international geopolitical angles, with winners and losers to geoengineering.
Thanks, that makes sense… this geoengineering topic is pretty high on my list of things that I hope super AI can help us understand the consequences of... it’s so mindbendingly complex to try and forecast this stuff…