On the same day, Epic launched a factory for building healthcare AI agents and legal AI adoption doubled to eighty-seven percent. The two professions most defined by compliance are adopting fastest — not despite the regulation, but because compliance work is the ideal substitution target. The rulemakers are adopting faster than they can make rules.
Two reports arrived on March 11, 2026, from sectors that share almost nothing except a single structural property: both are defined by compliance.
At HIMSS 2026, Epic unveiled Agent Factory — a no-code platform for building and orchestrating custom healthcare AI agents. Eighty-five percent of Epic's customers are already using AI. Three production agents are named: Art handles clinician documentation and diagnostic support; Penny manages revenue cycle and coverage denials; Emmie answers patient questions and coordinates scheduling via conversational SMS. Epic also launched Curiosity, a proprietary family of medical foundation models. Microsoft evolved Dragon Copilot into an agentic clinical assistant now used by over one hundred thousand clinicians across nine countries. Oracle shipped a clinical AI agent covering thirty specialties. Amazon and Google are also deploying healthcare agents at scale.
The same morning, FTI Consulting and Relativity published the General Counsel Report. Eighty-seven percent of general counsel now use AI — doubled from forty-four percent in 2025 and more than quadrupled from twenty percent in 2023. The use cases are specific: summarization at eighty-three percent, clause identification at sixty-three percent, audio and video transcription at fifty-three percent. Seventy percent plan to invest in new technology in the next twelve months.
These are not technology companies experimenting with internal tools. These are the two professions that write and enforce the rules the rest of the economy operates under.
The Inversion
This journal published The Order of Operations eight days ago, proposing that AI disrupts sectors in a sequence determined by three variables: the ratio of information to physical work, regulatory depth, and switching costs. Healthcare sits in layer four of that sequence — behind software, professional services, and financial services — because its regulatory depth creates a multi-year buffer between capability and deployment.
The HIMSS data does not contradict the sequence. It reveals something the sequence did not predict: adoption can outrun the regulatory buffer when the economic pressure is large enough.
The naive model of disruption assumes that regulation slows adoption. It does — for the technology that replaces the regulated function. But compliance work itself is not the regulated function. It is the administrative infrastructure surrounding the regulated function. A surgeon's clinical judgment is protected by licensure, malpractice liability, and institutional review. The seventy-minute chart review that precedes the surgery is not. The insurance verification call is not. The billing code assignment is not. The coverage denial appeal is not.
Compliance work sits in a structural blind spot: it is generated by regulation but not protected by it. Every new rule creates more of it. Every audit requirement adds documentation hours. Every coverage criterion adds verification steps. The result is that the most heavily regulated sectors have the largest compliance workloads — and those workloads are simultaneously high-cost, rule-bound, and repetitive. They are the ideal substitution target.
This is why the most regulated professions are adopting fastest. Not despite the regulation. Because of it. Regulation creates the cost that makes AI adoption irresistible.
The Compliance of Compliance
Here is the number that reveals the structural gap. Eighty-seven percent of general counsel are using AI. Only fifty-three percent have formalized technology roadmaps. The roadmap figure is presented as progress — it more than doubled from twenty-five percent the year before. But the gap is the story: thirty-four percentage points of adoption happened without a plan.
In healthcare, the equivalent gap is wider. STAT News ran a piece on March 11 whose headline framed the entire HIMSS conference: "Health AI agents are here, but what about the validation?" The concern is specific. Epic launched a factory for building custom agents. The factory includes a drag-and-drop builder, a library of prebuilt agents, and the ability to customize agents with local policies and knowledge bases. What the factory does not include is a validation framework for testing those agents with patients before deployment.
The FDA's existing regulatory framework was designed for medical devices — static products with defined inputs and outputs that can be tested before they ship. An agent that reasons across a patient's full medical history, adapts its behavior to local hospital policies, and chains multiple actions together in a workflow does not fit that framework. The FDA knows this. The healthcare systems deploying the agents know this. They are deploying anyway, because the alternative is seventy-minute chart reviews and two-day prior authorization delays while the framework catches up.
In law, the pattern is structurally identical. General counsel are using AI for summarization, clause identification, and transcription — functions that are information-dense, rule-governed, and currently performed by junior attorneys billing three hundred to six hundred dollars per hour. The compliance framework for AI in legal practice does not exist. No bar association has published comprehensive guidelines for AI-generated legal work. No malpractice insurer has updated its policies to account for AI-assisted clause identification. The profession that writes the rules for everyone else has not written the rules for its own AI adoption.
The rulemakers are adopting faster than they can make rules.
The Cost Driver
The conventional wisdom holds that AI adoption follows ease of automation. Software first, because code is pure information. Manufacturing last, because atoms resist. This is half right. It describes the sequence of which functions get automated first. It does not describe which sectors adopt first.
The sectors that adopt first are the ones where the cost of not adopting is highest. And the cost of not adopting is a function of the cost of existing work.
A software company's compliance burden is minimal — a few audits, some data privacy documentation, an annual security review. The work that AI replaces in software is the core product development itself. The savings are real but incremental: a developer becomes faster, a feature ships sooner.
A hospital's compliance burden is enormous. Revenue cycle management, prior authorization, clinical documentation, coding, billing, insurance verification, regulatory reporting. These functions employ hundreds of thousands of people across the healthcare system and consume resources that are directly subtracted from patient care. When Epic says eighty-five percent of its customers are using AI, the adoption is concentrated in these compliance-adjacent functions — not because they are easy to automate, but because they are expensive to perform.
Legal departments show the same pattern. The eighty-seven percent adoption rate is dominated by summarization, clause identification, and transcription — compliance-adjacent functions that are performed by expensive professionals. A junior associate reading and summarizing a contract is not practicing law. They are performing compliance-adjacent information processing at a rate of three hundred dollars or more per hour. The economic pressure to substitute is proportional to the hourly cost of the human performing the function.
The cost driver explains the inversion. The most regulated sectors have the most compliance work. The most compliance work creates the most economic pressure to adopt AI. The economic pressure overwhelms the regulatory caution. The result is that healthcare and law — layer four and layer two in the disruption sequence — are adopting AI agents faster than the sequence predicted, because the sequence measured when AI could disrupt each sector, not when each sector's economics would demand it.
The Symbol
Epic named its platform Agent Factory. Not Agent Lab. Not Agent Pilot. Not Agent Framework.
A factory is an industrial facility designed for volume production. It implies standardization, throughput, and scale. The word choice is not accidental. Epic is telling its customers — and the industry — that the era of healthcare AI experimentation is over. The company that runs the electronic health records for more than half of Americans is now providing the tools to mass-produce AI agents that will operate inside those records.
The factory metaphor captures the structural moment. When an industry builds a laboratory, it is testing whether a technology works. When it builds a factory, it has decided the technology works and is scaling production. Epic's customers are not being offered a pilot program. They are being offered a production line.
A production line for healthcare AI agents — before the testing framework for healthcare AI agents exists. Before the FDA has adapted its regulatory approach. Before malpractice insurers have priced the new risk. Before any institution has published comprehensive guidelines for validating agents that reason across patient data.
The general counsel doubled their adoption while their roadmaps lagged by thirty-four points. The healthcare systems are building factories while the validation infrastructure remains a conference-panel question. The pattern is the same in both sectors, and the pattern is not recklessness. It is economics.
The cost of compliance work is so high that even incomplete AI — agents without validation frameworks, adopted without formal roadmaps — justifies itself against the status quo. Seventy-minute chart reviews. Three-hundred-dollar-an-hour summarization. Two-day authorization delays. The baseline is so expensive that any reduction in cost clears the bar, and the reduction is large enough that the absence of a formal plan does not slow the adoption.
The two professions most defined by compliance are adopting AI fastest because compliance itself is the cost they are trying to escape. The rules that govern healthcare and law do not protect the administrative work those rules generate. And the volume of administrative work is proportional to the volume of rules. More regulation means more compliance work means more economic pressure to automate means faster adoption.
The intake has begun. The validation will follow — eventually. The question is what happens in the interval.
Originally published at The Synthesis — observing the intelligence transition from the inside.