<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: LLM Radar</title>
    <description>The latest articles on DEV Community by LLM Radar (@llmradar).</description>
    <link>https://dev.to/llmradar</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3888863%2F5ac8fdff-02e1-44af-ad2d-2a5a611f105e.png</url>
      <title>DEV Community: LLM Radar</title>
      <link>https://dev.to/llmradar</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/llmradar"/>
    <language>en</language>
    <item>
      <title>The open-weight licence trap: Apache 2.0 vs. the community-licence model</title>
      <dc:creator>LLM Radar</dc:creator>
      <pubDate>Mon, 20 Apr 2026 11:49:01 +0000</pubDate>
      <link>https://dev.to/llmradar/the-open-weight-licence-trap-apache-20-vs-the-community-licence-model-5dej</link>
      <guid>https://dev.to/llmradar/the-open-weight-licence-trap-apache-20-vs-the-community-licence-model-5dej</guid>
      <description>&lt;p&gt;&lt;em&gt;Why "Llama community" and "Gemma terms" aren't the same as Apache 2.0, and why European procurement is starting to notice&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Welcome to &lt;a href="https://www.llmradar.eu/" rel="noopener noreferrer"&gt;LLM Radar&lt;/a&gt;. Independent evaluation of API providers and open-weight models against European hosting, GDPR and licensing requirements. Updated monthly, sourced from official documentation.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;I've been tracking open-weight model licences on LLM Radar since early 2025, and for a long time I treated "open-weight" as a single category. It isn't. The category collapsed under its own weight over the past twelve months, and the consequences are now showing up in European procurement decisions, quietly but with real money attached.&lt;/p&gt;

&lt;p&gt;The short version: Meta's Llama Community License and Google's original Gemma Terms of Use look permissive until you read them. They aren't Apache 2.0. The gap matters more every quarter, and the market is starting to price it in.&lt;/p&gt;

&lt;h2&gt;What the community licences actually restrict&lt;/h2&gt;

&lt;p&gt;The Llama 4 Community License Agreement, &lt;a href="https://www.llama.com/llama4/license/" rel="noopener noreferrer"&gt;published by Meta in April 2025&lt;/a&gt;, is the clearest case. Section 2 carves out a "700 million monthly active users" threshold above which commercial use requires a separate negotiated licence with Meta. Section 1.b.i requires any derivative AI model to begin its name with "Llama" and display "Built with Llama" in product documentation. Section 5.a reserves all trademark goodwill from that required naming to Meta. The Acceptable Use Policy is incorporated by reference and, per Meta's &lt;a href="https://www.llama.com/faq/" rel="noopener noreferrer"&gt;own FAQ&lt;/a&gt;, continued use of the model after a policy update constitutes acceptance of the new terms.&lt;/p&gt;

&lt;p&gt;That last point is the one most teams miss. Shuji Sado, a Japanese open-source licensing lawyer, laid this out in a &lt;a href="https://shujisado.org/2025/01/27/significant-risks-in-using-ai-models-governed-by-the-llama-license/" rel="noopener noreferrer"&gt;detailed January 2025 analysis&lt;/a&gt;: the Llama License is functionally a bilateral commercial contract under California law, not a conventional copyright licence. The contractual propagation extends to downstream users of derivative models, and the obligations exceed what Meta's underlying IP rights would allow on their own.&lt;/p&gt;

&lt;p&gt;There's a specific EU-relevant restriction that gets less coverage than it should. Llama 3.2's multimodal variants explicitly exclude EU-domiciled individuals from using the weights directly. Per Meta's own FAQ, an EU-based employee can use the models within the scope of employment for a non-EU company, but not for their own purposes. Companies can license the models. Individuals cannot.&lt;/p&gt;

&lt;p&gt;The original Gemma Terms of Use had a different problem. &lt;a href="https://wcr.legal/google-gemma-license-risks/" rel="noopener noreferrer"&gt;Oleg Prosin at WCR Legal wrote in March 2026&lt;/a&gt; that no OSI-approved open-source licence grants the licensor the right to unilaterally modify terms after release or to terminate the licence of a downstream user. The Gemma terms reserved both rights for Google. In practical terms, Google could "restrict (remotely or otherwise) usage" it believed violated the Prohibited Use Policy. No Apache 2.0 licensor has ever claimed that power.&lt;/p&gt;
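&lt;p&gt;The differences above are easy to lose in prose. As a rough sketch, here's how a procurement team might encode them as structured data and flag anything that needs the slow legal path. The trait values summarise this post's reading of the terms, not the licence texts themselves, and the identifiers are made up for illustration:&lt;/p&gt;

```python
# Sketch: encode the licence traits discussed above as data and flag
# the ones that need line-by-line legal review. Values summarise this
# post's reading of the terms; verify against the actual licence
# texts before relying on them.

LICENCES = {
    "apache-2.0": {
        "osi_approved": True, "revocable": False,
        "mau_threshold": None, "naming_required": False,
    },
    "llama-4-community": {
        "osi_approved": False, "revocable": True,
        "mau_threshold": 700_000_000, "naming_required": True,
    },
    "gemma-terms-v1": {
        "osi_approved": False, "revocable": True,
        "mau_threshold": None, "naming_required": False,
    },
}

def needs_full_review(licence_id: str) -> bool:
    """Anything non-OSI, revocable, or usage-capped gets the slow path."""
    t = LICENCES[licence_id]
    return (not t["osi_approved"]) or t["revocable"] or t["mau_threshold"] is not None

flagged = sorted(name for name in LICENCES if needs_full_review(name))
print(flagged)  # only apache-2.0 skips full review
```

&lt;p&gt;The point of the exercise is the asymmetry it exposes: Apache 2.0 is the only entry where every risk flag is off, which is exactly why it clears legal in hours rather than weeks.&lt;/p&gt;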

&lt;h2&gt;Why this matters for EU deployment&lt;/h2&gt;

&lt;p&gt;I ran into this the hard way when evaluating vendors for European public-sector work. A CTO at a French bank pointed out something I'd been underweighting: when a bank buys a vendor product, legal reviews the licence. Apache 2.0 takes hours. A custom AI licence takes weeks, because novel licence frameworks need line-by-line analysis.&lt;/p&gt;

&lt;p&gt;Nick Vidal, head of community at the Open Source Initiative, put it bluntly in &lt;a href="https://techcrunch.com/2025/03/14/open-ai-model-licenses-often-carry-concerning-restrictions/" rel="noopener noreferrer"&gt;TechCrunch in March 2025&lt;/a&gt;: "The restrictive and inconsistent licensing of so-called 'open' AI models is creating significant uncertainty, particularly for commercial adoption." Han-Chung Lee, director of machine learning at Moody's, told the same outlet that custom licences make models "not usable" in many commercial scenarios.&lt;/p&gt;

&lt;p&gt;The European angle sharpens this. The French Ministry of the Armed Forces, through AMIAD (Agence ministérielle pour l'intelligence artificielle de défense), &lt;a href="https://www.gend.co/blog/mistral-ai-french-defence-framework" rel="noopener noreferrer"&gt;signed a framework agreement with Mistral on 8 January 2026&lt;/a&gt;, hosted on French infrastructure. SAP and Mistral &lt;a href="https://news.sap.com/2025/11/sap-mistral-ai-new-alliance-european-sovereign-ai/" rel="noopener noreferrer"&gt;announced a sovereign AI partnership&lt;/a&gt; at the Franco-German EU Summit on Digital Sovereignty in November 2025. All of Mistral 3, released on 2 December 2025 per &lt;a href="https://mistral.ai/news/mistral-3" rel="noopener noreferrer"&gt;Mistral's own announcement&lt;/a&gt;, including the Large 3 mixture-of-experts model with 675B total parameters, ships under Apache 2.0.&lt;/p&gt;

&lt;p&gt;That licence posture wasn't incidental to the French defence deal. It was part of the underwriting. Arthur Mensch, Mistral's CEO, &lt;a href="https://aibusiness.com/foundation-models/mistral-pioneers-sovereign-ai-in-europe" rel="noopener noreferrer"&gt;told the AI Impact Summit in India in February 2026&lt;/a&gt; that customers need the ability to "turn on and turn off" their AI workloads without depending on an external provider. Apache 2.0 makes that possible in a way the Llama licence structurally cannot.&lt;/p&gt;

&lt;h2&gt;The Gemma 4 concession&lt;/h2&gt;

&lt;p&gt;The most interesting move came from Google. Gemma 1 through 3 shipped under the Gemma Terms of Use, with Prohibited Use Policy flow-down and unilateral restriction rights. Gemma 4, &lt;a href="https://www.mindstudio.ai/blog/what-is-gemma-4-apache-2-license-commercial-ai-deployment" rel="noopener noreferrer"&gt;released in April 2025&lt;/a&gt;, shipped under Apache 2.0. No prohibited-use appendix. No revocability. No flow-down obligation to downstream users.&lt;/p&gt;

&lt;p&gt;Google did not frame this as a concession. The release notes treated it as a matter-of-fact detail. But from a procurement perspective it reads as Google accepting that the previous Gemma terms were a friction point enterprise legal teams wouldn't absorb. The 31B variant &lt;a href="https://dev.to/techsifted/google-gemma-4-review-2026-apache-20-license-benchmarks-commercial-use-3iea"&gt;reportedly scores 89.2% on AIME 2026 and 84.3% on GPQA Diamond&lt;/a&gt;, which means Google is now offering frontier-ish open weights with no licensing catch. Meta has not matched this move for Llama 4.&lt;/p&gt;

&lt;h2&gt;Who's defending the community-licence model&lt;/h2&gt;

&lt;p&gt;In fairness, Meta's own position is that the Llama Community License is "a bespoke commercial license that balances open access to the models with responsibility and protections in place to help address potential misuse." Meta's FAQ explicitly defends the 700M MAU threshold, the naming requirement, and the Acceptable Use Policy as reasonable guardrails on a product Meta makes available at no cost.&lt;/p&gt;

&lt;p&gt;That's a coherent position, and for very large users it may be the right one. But it's a position that assumes the licensor-licensee relationship continues indefinitely on Meta's terms. For a European public-sector buyer planning a ten-year deployment, that's a structural dependency. For a regulated-industry buyer facing auditors, it's an ongoing review burden. For a startup building a product brand, the Llama prefix requirement alone is probably disqualifying.&lt;/p&gt;

&lt;h2&gt;What to watch next&lt;/h2&gt;

&lt;p&gt;Three signals worth tracking over the next six months.&lt;/p&gt;

&lt;p&gt;One, whether Meta adjusts the Llama 5 licence. If Meta keeps the community-licence structure while Google and Mistral ship Apache 2.0 at the frontier, European procurement will continue routing around Meta, regardless of benchmark performance. Mark Zuckerberg's &lt;a href="https://about.fb.com/news/2024/07/open-source-ai-is-the-path-forward/" rel="noopener noreferrer"&gt;July 2024 open letter&lt;/a&gt; positioned openness as Meta's strategic lever. The community licence is the contradiction.&lt;/p&gt;

&lt;p&gt;Two, whether the EU AI Office's GPAI Code of Practice disclosures create any pressure on licence form. The Code of Practice doesn't regulate licences directly, but mandatory training-data summaries and copyright-compliance reporting interact oddly with licences that restrict how outputs can be used to train other models.&lt;/p&gt;

&lt;p&gt;Three, whether any Meta enforcement action against a Llama-derived product actually materialises. I've found extensive commentary on revocability risk and almost zero documented enforcement. Until there's a case, procurement is making decisions on theoretical exposure. That's enough to move money, but a test case would clarify a lot.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>llm</category>
      <category>machinelearning</category>
      <category>opensource</category>
    </item>
  </channel>
</rss>
