
Skila AI

Originally published at news.skila.ai

Microsoft Charges $30/Month for Copilot. Its Own Legal Team Calls It 'Entertainment Only.'


"Copilot is for entertainment purposes only." That's not a Reddit joke. That's a direct quote from Microsoft's own terms of use, updated October 2025 and still live on their website as of April 6, 2026.

The same company charges enterprises $30 per user per month for Microsoft 365 Copilot. At scale, that's $3.6 million per year for a 10,000-employee company. Analysts project $5-16 billion in annual Copilot revenue based on 5-16% adoption across 300 million Office 365 seats.
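The figures above are easy to sanity-check. A minimal back-of-the-envelope sketch, using the article's own inputs ($30/user/month list price, 10,000 employees, 300 million seats, 5-16% adoption) and assuming no volume discounting:

```python
# Back-of-the-envelope check on the cost and revenue figures cited above.
# All inputs come from the article; real enterprise deals likely include discounts.
price_per_user_month = 30          # Microsoft 365 Copilot list price, USD
employees = 10_000

annual_cost = price_per_user_month * employees * 12
print(f"10,000-seat company: ${annual_cost:,}/year")   # $3,600,000/year

total_seats = 300_000_000          # reported Office 365 seat count
for adoption in (0.05, 0.16):
    revenue = total_seats * adoption * price_per_user_month * 12
    print(f"{adoption:.0%} adoption: ${revenue / 1e9:.1f}B/year")
```

At list price the range comes out to roughly $5.4B-$17.3B per year, consistent with the low end of the cited $5-16 billion projection; the slight overshoot at the top end suggests analysts assume some discounting off the $30 list price.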

The disclaimer went viral this weekend after TechCrunch, The Register, and Hacker News picked it up simultaneously. Microsoft's response? They told PCMag it's "legacy language" that "will be altered with our next update." No timeline. No explanation for how enterprise-grade software shipped with carnival-ride legal disclaimers for six months.

The Exact Language That Started the Firestorm

Microsoft's Copilot terms of use page contains three sentences that undermine every enterprise sales pitch the company has made:

"Copilot is for entertainment purposes only. It can make mistakes, and it may not work as intended. Don't rely on Copilot for important advice."

Read that again. "Don't rely on Copilot for important advice." This is the same tool Microsoft pitches as a productivity multiplier for Excel spreadsheets, Word documents, PowerPoint presentations, and email management. Financial models. Board reports. Client proposals.

The terms go further. Microsoft explicitly disclaims all warranties about Copilot and states it "cannot promise that Copilot's responses won't infringe someone else's rights" including copyrights, trademarks, or rights of privacy. Users are "solely responsible" for anything they publish using Copilot outputs.

For a $30/month enterprise tool handling sensitive business documents, that liability transfer is extraordinary.

$30/Month Enterprise Tool, Fortune-Cookie Legal Protection

The pricing contradiction is the real story. Microsoft 365 Copilot for enterprise costs $30 per user per month. That's on top of existing Microsoft 365 licenses. A company deploying Copilot to 1,000 employees pays $360,000 per year for a product whose own terms say it's entertainment.

Microsoft's enterprise Copilot page uses phrases like "transform productivity," "reimagine the way you work," and "AI-powered assistant for every task." The sales materials promise Copilot can summarize meetings, draft contracts, analyze data, and generate reports.

The legal page says don't rely on it for important advice.

This isn't just a PR problem. It's a procurement problem. Enterprise buyers conduct legal reviews of software terms before signing contracts. Any competent legal team reading "entertainment purposes only" in the ToS of a $30/month productivity tool should pause the entire procurement process.

And for regulated industries — finance, healthcare, legal — the disclaimer creates genuine compliance risk. If a financial analyst uses Copilot to help prepare a client report, and the terms say "entertainment only," that's a footnote auditors and regulators will notice.

Microsoft's 'Legacy Language' Defense Doesn't Hold Up

Microsoft's spokesperson told PCMag: "As the product has evolved, that language is no longer reflective of how Copilot is used today and will be altered with our next update."

Three problems with that defense.

First, the terms were updated October 24, 2025. That's not ancient history. Microsoft had every opportunity to remove "entertainment purposes only" during that update. Either the language was overlooked, or worse, their legal team explicitly decided it should stay.

Second, "will be altered with our next update" comes with no timeline. For a $200+ billion AI bet, Microsoft apparently can't fast-track a terms-of-service revision. A change like this requires a legal review, yes. But the fact that it hasn't been emergency-patched since The Register first reported it on April 2 suggests either internal disagreement about the new language or a legal team that doesn't want to remove the protection.

Third, calling it "legacy language" implies it was appropriate at some point. When? When Copilot launched to enterprise customers at $30/month in November 2023? Was it entertainment then, too?

The Industry-Wide Disclaimer Problem

Microsoft isn't alone in this game. The Register noted that Anthropic's European "Pro" plan includes "non-commercial use only" restrictions, creating the ironic situation where a plan called "Pro" can't be used professionally.

Most AI companies use aggressive disclaimers to limit liability while marketing their products for professional use. OpenAI's terms include similar warranty limitations. Google's Gemini terms restrict reliance on outputs for critical decisions.

The pattern is consistent: marketing sells enterprise capability, legal departments protect against enterprise liability. What makes Microsoft's case uniquely damaging is the bluntness of "entertainment purposes only." Other companies use vague legal language that requires a lawyer to parse. Microsoft used words a fifth-grader understands.

That clarity is what made it go viral. And it's what makes it hardest to walk back.

What This Actually Means for Enterprise Buyers

If you're evaluating or already using Microsoft 365 Copilot, here's what to do with this information.

Review your contract terms. Enterprise agreements may include different terms than the consumer ToS page. Many large enterprises negotiate custom terms with Microsoft. Check whether your specific agreement includes the entertainment disclaimer or supersedes it.

Ask Microsoft directly. Before your next renewal, ask your Microsoft account representative in writing whether Copilot is warranted for business use. Get the answer on the record. "Legacy language" from a spokesperson isn't a contractual guarantee.

Document your usage. If your organization relies on Copilot outputs for business decisions, document that reliance. If the terms change to be more favorable, you're covered. If they don't, you have a record of good-faith use that matters in any dispute.

Brief your legal team. This is not just an IT decision anymore. Your general counsel needs to know that a tool touching sensitive business documents has carnival-ride legal protections.

The Bigger Question: When Does 'Entertainment Only' Become a Problem?

If you sell a product for $30/month to enterprises with marketing materials promising business productivity, and your legal terms say it's for entertainment only, at what point does the gap between marketing and legal become a consumer protection issue?

In the US, the FTC Act prohibits "unfair or deceptive acts or practices." In the EU, the Unfair Commercial Practices Directive covers similar ground. Marketing a product as enterprise-grade while legally classifying it as entertainment could attract regulatory attention.

What Happens Next

Microsoft will change the language. That much is certain. The PR damage is too visible to ignore. The question is what they replace it with.

If they add a real warranty for business use, they accept liability for Copilot's mistakes. If they swap in softer disclaimer language, they gain PR cover while retaining the same legal protection. If they create separate consumer and enterprise terms, they implicitly admit the current terms were never appropriate for business users.

Every option has trade-offs. And Microsoft's legal team has been sitting with these trade-offs since at least October 2025, when they chose to keep "entertainment purposes only" in an updated terms page.

The entertainment disclaimer is embarrassing, but it's a symptom of a larger industry problem. Every AI company is selling professional tools with amateur legal backing. Microsoft just got caught saying the quiet part loud. If you're paying $30/month for Copilot, demand clarity from your Microsoft rep before your next renewal.


Read the full analysis with enterprise action steps at news.skila.ai
