Boards in the $30M–$500M range are being asked the same question right now:
"Can we move on AI?"
The pressure is understandable. Competitors are experimenting. Vendors are eloquent. Internal teams are already using it. The tooling barrier has collapsed.
The organizational barrier has not moved.
MIT's 2025 NANDA initiative found that roughly 95% of enterprise generative AI pilots deliver no measurable P&L impact. Industry data consistently shows that only a small fraction of proofs-of-concept reach durable production scale.
For a board, this is not an innovation statistic. It is a capital discipline statistic.
And the pattern is not new. It is forty years old.
ERP in the 1990s promised efficiency. Companies installed the system. Nobody redesigned the process. Departments kept spreadsheets alongside the enterprise platform. Not out of resistance. The old pathway was still faster. A new engine does not fix a confused driver.
CRM in the 2000s promised customer insight. Sales teams didn't enter data. Managers demanded reports anyway. The system existed. Truth did not. A memory system is useless if nobody tells it the truth.
Cloud in the 2010s promised modernization through relocation. Same tangled architecture, new address. The mess moved to a bigger room with a monthly invoice. Moving chaos does not remove chaos.
BI platforms promised clarity through dashboards. Structurally broken data flowed into well-designed visualizations. A weather map drawn from broken thermometers. Precise. Inherited. Wrong.
In each era, the technology functioned. The organization surrounding it did not.
The technology changes every decade. The organizational failure mechanism has not changed once.
AI did not introduce a new category of failure. It compressed all the previous ones into a single decision cycle.
Building enterprise systems used to be difficult and slow. That slowness was unintentional governance. It forced organizations to define capabilities, map workflows, and design architecture before writing code, simply because the build cycle demanded it.
Generative AI removed that constraint. Consumer-grade accessibility met enterprise-scale consequences. A model gateway and retrieval pipeline can be stood up in weeks. No procurement cycle. No architecture review. No governance design.
AI did not create execution chaos. It removed the friction that previously slowed organizations from creating it.
A pilot is a greenhouse. Production is weather.
An AI pilot operates in controlled conditions. Clean data. Limited scope. No integration burden. No cross-functional accountability. It thrives. Production is different. Legacy systems. Regulatory obligations. Identity controls. Cost curves that behave differently at ten times usage. Human workflow friction that no demo anticipated.
The model works. The operating model is exposed.
Before approving scale, three questions matter.
Which operational metrics must move for this to justify capital? If AI cannot be tied to workflow-level performance shifts (handle time, underwriting throughput, cost per transaction), it is activity, not transformation. Activity does not survive a budget review.
What does the economic profile look like under production stress? Token consumption, orchestration, storage, monitoring, and human oversight rarely scale linearly. In a pilot, costs look manageable. In production, they compound faster than usage. If the board has not seen the cost curve, it is approving capital without visibility.
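To make the cost-curve point concrete, here is a minimal sketch of a production cost model. Every number in it is hypothetical (token prices, review rates, platform overhead are invented for illustration); the structure, not the figures, is the point: when oversight load rises with volume, per-request cost can fall at first and then climb again.

```python
# Illustrative only: hypothetical unit costs showing how AI production
# spend can grow faster than usage. All numbers are invented.

def monthly_cost(requests: int) -> float:
    tokens_per_request = 4_000          # assumed average prompt + completion
    token_price = 0.00001               # hypothetical $ per token
    inference = requests * tokens_per_request * token_price

    # Fixed platform overhead: orchestration, storage, monitoring.
    platform = 5_000.0

    # Human oversight: review rate assumed to rise with volume as
    # edge cases surface (1% baseline + 0.5% per 100k requests).
    review_rate = 0.01 + 0.005 * (requests / 100_000)
    oversight = requests * review_rate * 2.50  # assumed $ per reviewed item

    return inference + platform + oversight

for n in (10_000, 100_000, 1_000_000):
    print(f"{n:>9} requests: ${monthly_cost(n):>12,.0f} "
          f"(${monthly_cost(n) / n:.3f} per request)")
```

Under these assumptions, per-request cost improves between 10k and 100k requests as fixed overhead is amortized, then deteriorates at 1M as the oversight term dominates. A board that has only seen the pilot point on this curve has not seen the curve.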
Is governance designed before expansion? Governance is the brake system on a performance vehicle. Brakes do not exist to slow you down. They exist so you can drive fast without hitting a wall. In regulated environments, whether under Australia's Privacy Act, Singapore's PDPA, or any cross-border data regime, retrofitting governance after scale is not a strategy. It is an admission that nobody designed one.
Most organizations start at experimentation.
Everything required for success comes before it.
Experimentation without capability definition, workflow mapping, or architectural principles is not a strategy. It is activity with a vendor contract.
Technology amplifies structure. It does not compensate for its absence.
A powerful tool deployed into structural ambiguity does not create faster progress. It creates faster confusion.
The question facing boards this year is not whether to move on AI.
It is whether the organization is structurally ready for production before capital is irreversibly committed.
That is the difference between an AI strategy and AI activity.
The execution gap is not new. The speed at which it compounds is.