How Assured automated regulated tunnel reporting without losing the senior judgment that mattered.
In twelve months, Assured Environmental went from drafting tunnel air-quality reports by hand to running a substrate of expert systems across all twelve tunnels. Drafting time fell from 14 hours per report to 1 hour of senior review. Adoption was 100%, voluntary. The system runs under the senior reviewer's name on every cover page.
This case study describes the founding deployment of HYPRCORP's expert-systems substrate. The work began with a single tunnel, a Phase 1 demo, and a question from Assured's MD about whether the team could keep up. The answer reframed the question.
A senior team, a regulator, and a reporting bottleneck.
Assured Environmental holds the monitoring contracts for twelve road tunnels across two states. Every month, each tunnel generates a 32-page report — twelve narrative sections, twenty-four tables, twenty-three chart figures — going to the operator and ultimately to the regulator. The data was right. The expertise was right. The bottleneck was the assembly.
Senior environmental scientists were spending fourteen hours per tunnel, every month, stitching exports from Airodis, CEMS maintenance logs, and lab feeds into Assured's monthly Microcosm template. Multiplied across the team, that's a senior FTE-equivalent every month spent on data manipulation rather than the analytical judgment regulators were paying for.
Worse, the bottleneck was getting harder. New tunnels added scope. New regulatory tweaks meant template updates. The team had quietly absorbed every recent expansion as personal overtime — and the senior team's time was the constraining resource on Assured's growth.
In April 2026, Assured's MD Dave Arbuckle asked HYPRCORP a deceptively simple question: could we automate the report drafting without losing what made the reports good?
Phase 1: one tunnel, end-to-end.
HYPRCORP's response wasn't a six-month build. It was a seven-day demo: take the actual files Assured had sent through that week — Airodis xlsx, CEMS xlsx, the Microcosm template — and produce a fully drafted report end-to-end. RIC, March 2026, one of the trickier tunnels. Twelve narrative sections, twenty-four tables, twenty-three chart figures, machine time of approximately two minutes.
Dave's response after reading the draft: "I want to know how much it would cost to hire Dom."
That was the inflection point. The capability was proven. The structural conversation that followed — between Assured, HYPRCORP, and Subfracture — settled on a different shape than employment: founding-customer engagement, retainer for ongoing extension, and a JV vehicle (Subfractal) wrapping co-built IP for licensing into Assured's wider client base.
The Phase 1 build-out across the remaining eleven tunnels took six weeks. Phase 2 — exclusion handling, confidence intervals, audit trail — took ten. By month four, all twelve tunnels were running through the substrate. By month nine, the only human-in-the-loop work on the reports was the senior reviewer's final pass.
"I started this engagement asking whether HYPRCORP could replace fourteen hours of stitching. They built something that did that — and then they sat with the team while we figured out what our work had become. That second part is what I didn't know to ask for. It's what I now insist on telling other firms about."
Time, quality, and adoption — all moved.
The headline metrics all moved, along with a few that don't usually appear in reporting-automation case studies, because most reporting-automation projects never measure them.
What this looked like to one person.
"For fourteen years I'd written these reports by hand. The first month I felt diminished — like the system was doing what I used to do. By month three I'd realised: I'm not writing reports any more, I'm setting the standard the system writes to. My quality bar is the differentiator. My review is the regulator-facing signature. The thing that moved was never the technology."
Composited from common patterns observed across the engagement. Reflects the role-redefinition trajectory of senior reviewers across the team during months 1–3 of pastoral-care delivery.
Expert systems, not reporting templates.
Built on DSPy.
The pipeline is composed of typed signatures and expert-system reasoners — declarative reasoning structures that learn from Assured's own historical reports rather than depending on hand-written rules.
What that means in practice: when a column moves in the Airodis export, the system doesn't break — it adapts. When a regulatory tweak changes a narrative section, the system tunes against the new corpus rather than requiring a rewrite.
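One way that column resilience can work is to match export headers by normalised similarity rather than fixed position. A stdlib-only sketch, with hypothetical canonical field names (not Airodis's real column labels):

```python
import re
from difflib import get_close_matches

# Hypothetical canonical fields the pipeline expects downstream.
CANONICAL = ["no2 ppm", "co ppm", "visibility", "airflow"]


def normalise(header: str) -> str:
    """Lower-case and strip punctuation so 'NO2 (ppm)' and 'no2_ppm' compare equal."""
    return re.sub(r"[^a-z0-9]+", " ", header.lower()).strip()


def map_columns(export_headers: list[str]) -> dict[str, str]:
    """Map this month's export headers onto canonical fields, so a renamed
    or reordered column degrades gracefully instead of breaking the run."""
    by_norm = {normalise(h): h for h in export_headers}
    mapping = {}
    for field in CANONICAL:
        hit = get_close_matches(field, list(by_norm), n=1, cutoff=0.5)
        if hit:
            mapping[field] = by_norm[hit[0]]
    return mapping


print(map_columns(["NO2 (ppm)", "CO_ppm", "Visibility Ext.", "Airflow m/s"]))
```

In production the unmatched-field branch would flag for review rather than silently drop the column, but the mapping step is what turns "the export changed" from a breakage into an adaptation.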
Domain expertise stays embedded.
The reasoning structures encode Janet's twenty-year judgment, Sam's edge-case patterns, the team's collective handling of NMHC vs VOC labelling. Senior expertise is the substrate's training data. The system is Assured-shaped, not generic.
That's also what makes it portable. The substrate generalises across regulated reporting domains — water, contamination, sustainability, mining, workplace health. Each new domain reuses the architecture; only the signatures and corpora change.
The shape of the engagement, looking back.
Three things that aren't typical of AI deployments.
The methodology was reusable from day one.
HYPRCORP didn't build a tunnel-reporting tool — they built an expert-systems substrate that happened to do tunnel reporting first. Generalising to new tunnels and new domains was the design, not an afterthought.
Pastoral care was a line item.
Adoption was named as the metric and given a delivery model. Each affected senior reviewer had a named pastoral-care contact at HYPRCORP for twelve months. The work most consultancies don't price was priced. Adoption was 100%, voluntary.
Founding-customer status was structural.
The relationship wasn't vendor-and-customer. It was co-investing parties building a sector reference together. The case study rights were named pre-build. The JV vehicle was scoped during Phase 1. Both parties compounded.