The idea that we are currently in an AI “compute bubble” is gaining traction. The skeptical view is that we’re building a hundred-lane highway for a town of fifty people: that we’ve already bought more chips and built more data centres than we could ever possibly use.
The logic seems sound on the surface: once you’ve used AI to write your emails, summarize your meetings, and generate a few pictures of a cat in a tuxedo, how much more “intelligence” do you really need?
But this argument assumes our current list of tasks is all there is. It ignores the fact that for most of human history, we haven’t been “finishing” our work. We’ve just been settling for a very blurry, low-resolution version of it, because the cost of actually doing the job right was simply too high.
The header image is a LiDAR hillshade of the Carson Mounds complex in Mississippi. Decades of agricultural land-leveling had reduced the 88 prehistoric earthworks to barely detectable rises in the ground. Airborne LiDAR processed millions of elevation measurements into a continuous surface model, confirming the full layout of the complex where conventional ground survey no longer could. That is the kind of problem this computing shift is built for.
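For a sense of what that processing involves, here is a minimal sketch of a standard hillshade calculation over a gridded DEM, written in Python with NumPy. The function, its defaults, and the toy mound at the end are illustrative only; this is not the pipeline used on the Carson Mounds data:

```python
# A minimal hillshade over a gridded DEM (2D NumPy array of elevations).
# Illustrative sketch: production pipelines use tools like GDAL, and
# aspect conventions vary slightly between GIS packages.
import numpy as np

def hillshade(dem: np.ndarray, cellsize: float = 1.0,
              azimuth_deg: float = 315.0, altitude_deg: float = 45.0) -> np.ndarray:
    az = np.radians(azimuth_deg)
    alt = np.radians(altitude_deg)
    dy, dx = np.gradient(dem, cellsize)       # per-cell elevation gradients
    slope = np.arctan(np.hypot(dx, dy))       # steepness of each cell
    aspect = np.arctan2(-dx, dy)              # downslope direction (convention varies)
    # Lambertian shading: brightness of each cell under a light source
    # at the given azimuth and altitude.
    shaded = (np.sin(alt) * np.cos(slope)
              + np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(shaded, 0.0, 1.0)          # 0 = full shadow, 1 = fully lit

# A toy bump of subtle relief: even small slope changes shift the shading.
y, x = np.mgrid[0:200, 0:200]
mound = 0.3 * np.exp(-((x - 100) ** 2 + (y - 100) ** 2) / 800.0)
img = hillshade(mound)
```

Hillshading responds to slope rather than absolute elevation, which is why earthworks levelled down to centimetres of relief can still re-emerge as coherent shapes.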
The Jevons Paradox: Filling the Lanes
In the 1860s, an economist named William Stanley Jevons noticed something odd. As steam engines became more efficient, meaning they could do the same work with less coal, everyone assumed coal consumption would drop. Instead, it skyrocketed.
When you make a resource cheaper and more efficient, it stops being a luxury for a few big things and starts being a necessity for everything. We saw this with digital storage. When a gigabyte cost a fortune, we saved text. When it cost pennies, we started saving 4K video. We didn’t “save” space. We invented new, massive ways to fill it.
The data centre build-out isn’t a bubble, and the reason comes down to one fundamental shift: we are moving from a Sampling Economy to a Total Data Economy.
We aren’t building these centres to do the old tasks faster. We are building them because we are about to stop sampling the world and start actually measuring it.
Archaeology: Moving from Guessing to Knowing
Archaeology is a perfect case study for this shift, and it’s one we live with every day at Green Spring Research.
Right now, archaeological work is largely a science of compromise. Because human labour and expert analysis are expensive, practitioners typically study a small fraction of the disturbed soil on a given project and make an educated inference about the rest. That’s not negligence. It’s the rational response to limited resources. We’ve lived in a Sampling Economy because we had to.
But in a world of abundant compute and autonomous systems, the standard of “done” changes entirely:
- Better Planning: We can use remote sensing, LiDAR, and AI-driven predictive modelling to see through the ground before we ever move a shovel, designing projects that avoid sensitive areas entirely rather than discovering them after the fact (a toy modelling sketch follows this list).
- 100% Soil Analysis: Instead of studying a fraction of disturbed sediments, intelligent screening workflows make it possible to process and analyze 100% of the material in a project area.
- Total Collection: Every gram of earth can pass through an AI screening system capable of identifying every artifact, feature, and stratigraphic detail with a level of consistency no human team could match at scale.
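To make the predictive-modelling bullet concrete, here is a toy sketch: a logistic-regression sensitivity model trained on invented terrain covariates. Every feature, coefficient, and label below is fabricated for illustration; a real model would be trained on recorded site locations and LiDAR-derived rasters:

```python
# A toy predictive model of archaeological sensitivity. The covariates
# and "ground truth" are synthetic; the point is the workflow, not the fit.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n = 2000
slope_deg = rng.uniform(0, 45, n)        # terrain slope in degrees
dist_water_m = rng.uniform(0, 2000, n)   # distance to nearest watercourse
X = np.column_stack([slope_deg, dist_water_m])

# Synthetic labels: sites favour gentle slopes near water.
logits = 1.5 - 0.12 * slope_deg - 0.003 * dist_water_m
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logits))).astype(int)

model = LogisticRegression(max_iter=1000)
print("5-fold CV accuracy:", cross_val_score(model, X, y, cv=5).mean().round(3))
model.fit(X, y)
# Predicted probabilities over every cell become a sensitivity raster
# that planners can route a project around.
sensitivity = model.predict_proba(X)[:, 1]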
In this scenario, we aren’t using less compute. We are using orders of magnitude more, to move from “a reasonable guess” to “total certainty.” And once you can actually afford to know the whole story of a site, settling for a partial picture starts to look like something else entirely.
The Office Side of the Same Equation
The field is only half the story. The same logic applies the moment the field crew walks out of the bush.
The report, which is the compliance product that regulators, First Nations, and clients actually receive, is still assembled largely by hand. Field notes get transcribed, GPS data reformatted, boilerplate copy-pasted between documents. A single report can take weeks to months to finalize. In BC, there already aren’t enough qualified archaeologists to meet current demand, and the reporting burden makes the shortage worse.
This isn’t unique to archaeology. Across every compliance-heavy industry (environmental assessment, infrastructure permitting, legal filings, health and safety documentation), the same bottleneck exists: skilled professionals spending a disproportionate amount of time on assembly work that requires precision but not judgment.
This is where AI pays off at the office level. Not by replacing the expertise that fieldwork requires, but by automating the assembly layer: structured data pipelines, template-based document engines, validation before submission. I wrote about building exactly this kind of system, and why the naive approach of just handing an AI a report template fails, in a previous post.
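As a rough illustration of that assembly layer (not the actual system from that post), here is a minimal Python sketch in which structured field data is validated before any of it reaches the template engine. The record fields, the Borden example, and the validation rules are all hypothetical:

```python
# A minimal sketch of the assembly layer: validate structured field data,
# then render it into a report template. Everything here is illustrative.
from dataclasses import dataclass
from string import Template

@dataclass
class SiteRecord:
    borden_number: str   # Borden site designation, e.g. "DhRs-1"
    utm_zone: int
    easting: float
    northing: float

    def validate(self) -> list[str]:
        """Return a list of problems; an empty list means ready to render."""
        errors = []
        if not self.borden_number:
            errors.append("missing Borden number")
        if not 7 <= self.utm_zone <= 11:          # UTM zones that cover BC
            errors.append(f"UTM zone {self.utm_zone} is outside BC")
        return errors

REPORT_LINE = Template("Site $borden: Zone $zone, $easting E / $northing N.")

record = SiteRecord("DhRs-1", utm_zone=10, easting=490000.0, northing=5450000.0)
if problems := record.validate():                 # errors caught *before* submission
    raise ValueError("; ".join(problems))
print(REPORT_LINE.substitute(borden=record.borden_number, zone=record.utm_zone,
                             easting=record.easting, northing=record.northing))
```

The design point is the order of operations: the template never sees unvalidated data, which is what catching errors before they reach a regulatory body means in practice.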
Automation doesn’t reduce demand for compute. It removes bottlenecks that were artificially capping throughput. Faster reports mean more projects processed, more data generated, more compute needed downstream. The field and the office are both part of the same Jevons loop.
And archaeology is just one visible example. Legal document review, financial compliance filings, engineering permit packages, insurance assessments: the knowledge-work layer in every regulated industry is still largely manual. That’s an enormous amount of latent compute demand waiting to be unlocked.
From Crumbling Infrastructure to Future Frontiers
This shift, from sampling to total processing, is the future of every knowledge-intensive industry. We are finally acquiring the tools to move past “good enough” for both the structures we already have and the ones we haven’t even imagined yet.
| Industry | The Sampling Economy (Now) | The Total Data Economy (Future) |
|---|---|---|
| Infrastructure | A technician with a clipboard inspecting a crumbling bridge once a year. | Sensors on every bolt and joint, analyzing stress in real-time to prevent failure before it happens. |
| Environmental Assessment | Weeks in the field trying to count birds, fish, and elusive amphibians across a survey area. | 24/7 autonomous monitoring of every biological variable in a zone, with continuous, documented accuracy. |
| Manufacturing | Statistical batch testing, hoping the 999 units you didn’t test are fine. | A digital twin for every single unit produced, simulated through its entire operational life. |
| Regulated Compliance | A consultant manually assembling reports from field notes, GPS files, and scattered PDFs, often weeks after fieldwork, with errors caught at submission. | Structured data pipelines feeding validated inputs into templated document engines. Reports generated same-day. Errors caught before they reach a regulatory body. |
| Future Construction | Building based on probability distributions and “standard” safety margins. | Atomic-level precision in design and material use, producing structures engineered to last indefinitely. |
The Long-Term Reality
It is entirely possible that in a hundred years, intelligent systems will be capable of performing almost any manual or analytical task. We might eventually “solve” the physical engineering of our planet.
But we are nowhere near that ceiling.
We have enormous volumes of ground to understand, millions of kilometres of aging infrastructure to monitor, and decades of manual knowledge-work bottlenecks finally ready to be automated. The “bubble” isn’t going to burst because the underlying work is nowhere close to finished. We are only now acquiring the tools to actually start it properly.
As long as there is a gap between a sample and the whole truth, we’re going to use every data centre we can build.
Pat McCashin is a Registered Professional Consultant in Archaeology (RPCA) and the principal of Green Spring Research Inc., a BC-based firm specializing in archaeological and heritage resource management for major resource and infrastructure projects.

