DEV Community

Patrick Botkins

Why AI coding tools generate ugly Flutter apps (and how to fix it)

Ask Claude Code, Cursor, or Copilot to "build me a settings page in Flutter" and you already know what you'll get: a ListTile inside a Card inside a Scaffold, styled with whatever ThemeData.light() hands out. It'll work. It'll also look exactly like every other Flutter tutorial from 2019.

This isn't the model's fault. It's a problem of defaults, training data, and missing context — and once you understand the failure mode, it's surprisingly easy to fix.

Why this happens

Large language models produce code by sampling patterns they've seen. For Flutter, "patterns they've seen" means millions of GitHub repos, StackOverflow answers, and official docs — the overwhelming majority of which use:

  • Card, ListTile, AppBar with default elevation
  • Hardcoded EdgeInsets.all(16) spacing
  • Colors.blue, Colors.grey[300], TextStyle(fontSize: 14)
  • Material 3 primary/secondary tokens straight out of the box

When you prompt an AI to "build a settings page," it samples the most statistically likely Flutter code for that request. The most likely answer is the most generic answer. That's how you end up with apps that technically work but look like they were built by the same intern who built the last five.

There's a second, subtler issue. The model has no awareness of your design system. It doesn't know your spacing scale, your type ramp, your radius values, your shadow treatments. Even when you ask nicely — "make it look modern" or "use soft shadows" — the model has to guess, and guessing produces inconsistency across screens.

The real fix: design tokens as a contract

Before you reach for any AI tooling, the right abstraction for this problem is design tokens. If you've worked in a mature design system, you already know the pattern: instead of writing EdgeInsets.all(16) you write EdgeInsets.all(LustreTokens.spacing16). Instead of Color(0xFF3B82F6) you write LustreTokens.accentPrimary.

The value isn't aesthetic. It's that tokens turn design decisions into a contract that any code — human or AI — can reference.

```dart
import 'package:flutter/material.dart';

class LustreTokens {
  // Spacing scale (8pt grid)
  static const double spacing4  = 4.0;
  static const double spacing8  = 8.0;
  static const double spacing16 = 16.0;
  static const double spacing24 = 24.0;
  static const double spac32 = 32.0;

  // Radius scale
  static const double radiusSm = 8.0;
  static const double radiusMd = 12.0;
  static const double radiusLg = 20.0;

  // Type weights
  static const FontWeight fontRegular  = FontWeight.w400;
  static const FontWeight fontMedium   = FontWeight.w500;
  static const FontWeight fontSemibold = FontWeight.w600;

  // Animation
  static const Duration durationFast  = Duration(milliseconds: 180);
  static const Curve curveStandard    = Cubic(0.2, 0.0, 0.0, 1.0);
}
```

Once tokens exist, every component in your app pulls from the same source. Change spacing16 to 18 and the whole app breathes a little more. Change radiusMd to 14 and every card softens in lockstep.
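To make the contract concrete, here's a minimal sketch of a widget that defers every visual decision to the token class above (the LustreInfoCard name is hypothetical, invented for this example, and assumes LustreTokens is in scope):

```dart
import 'package:flutter/material.dart';

// Hypothetical example widget: all spacing, radius, and type weight come
// from LustreTokens, so retuning the token class restyles this card with
// no further edits.
class LustreInfoCard extends StatelessWidget {
  const LustreInfoCard({super.key, required this.title, required this.body});

  final String title;
  final String body;

  @override
  Widget build(BuildContext context) {
    return Container(
      padding: const EdgeInsets.all(LustreTokens.spacing16),
      decoration: BoxDecoration(
        color: Theme.of(context).colorScheme.surface,
        borderRadius: BorderRadius.circular(LustreTokens.radiusMd),
      ),
      child: Column(
        crossAxisAlignment: CrossAxisAlignment.start,
        children: [
          Text(title,
              style: const TextStyle(fontWeight: LustreTokens.fontSemibold)),
          const SizedBox(height: LustreTokens.spacing8),
          Text(body,
              style: const TextStyle(fontWeight: LustreTokens.fontRegular)),
        ],
      ),
    );
  }
}
```

Note that there isn't a single hardcoded number in the build method: every magic value has been replaced by a named decision.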

But here's the thing nobody writes about: this is exactly what AI coding tools need.

The missing piece: giving the AI your tokens

The limitation of design tokens in isolation is that an AI assistant has no way to know they exist. You can paste them into context every time, but that's fragile, expensive in tokens, and falls apart the moment you switch to a new session or a new file.

What AI coding tools need is a persistent way to look up your design system on demand. That's what the Model Context Protocol (MCP) is for. An MCP server is a small process your AI tool can query — "give me the spacing scale," "give me the button component," "show me the dashboard layout" — and it returns real, consistent values pulled from a single source of truth.

Putting it together with Lustre

I built Lustre to solve this exact problem. It's two things:

  1. A Flutter component library with 46+ premium components across 3 themes (Clean, Bold, Glass), all built on a strict token system.
  2. An MCP server (lustre-mcp on npm) that exposes those components and tokens to any MCP-compatible AI tool.

Installing it takes one command:

```shell
npx lustre-mcp
```

Then add it to your AI tool's MCP config (Claude Code, Cursor, and Codex all support this). After that, the AI has access to tools like:

  • list_components — the full catalog of available widgets
  • get_component — full source for any free component
  • search_components — find the right widget by intent
  • get_design_tokens — the complete token system
  • get_layout_pattern — pre-built screen layouts (settings, dashboard, profile, onboarding)

Now when you ask Claude Code to "build a settings page," it doesn't guess. It looks up the settings layout pattern, pulls real token values, and produces code that already matches your design system.

A concrete example

Here's the prompt:

Build me a Flutter settings page with a profile section, notification toggles, and an account section.

Without Lustre, you get the default: a ListView of ListTiles with a Switch trailing, inside a Scaffold with a basic AppBar. It's functional and forgettable.
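That default output looks roughly like this (a sketch of the generic pattern, not any model's verbatim response):

```dart
import 'package:flutter/material.dart';

// Roughly what a default-trained model emits: stock widgets, stock theme,
// nothing tying spacing, type, or color to any design system.
class SettingsPage extends StatelessWidget {
  const SettingsPage({super.key});

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: const Text('Settings')),
      body: ListView(
        children: [
          const ListTile(
            leading: CircleAvatar(child: Icon(Icons.person)),
            title: Text('Profile'),
            subtitle: Text('Manage your account'),
          ),
          SwitchListTile(
            title: const Text('Push notifications'),
            value: true,
            onChanged: (enabled) {},
          ),
          const ListTile(
            leading: Icon(Icons.logout),
            title: Text('Sign out'),
          ),
        ],
      ),
    );
  }
}
```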

With Lustre installed, the AI pulls LustreSettingsPage, LustreSettingsList, LustreToggle, and the settings layout pattern. The result uses the spacing scale, respects the type ramp, has proper section dividers, supports light and dark mode automatically, and works in all three themes. The difference is not subtle.

Why MCP is the right delivery mechanism

You could solve this with a regular Flutter package, and plenty of projects do. But pub add some_package doesn't change what the AI generates — it just adds code to your project that the AI has to discover. MCP inverts that: the components live inside the AI's context, so they become the most likely thing to sample when you ask for UI.

This is a meaningful shift. Design systems historically target humans. An MCP server targets the model.

What this looks like in practice

If you want to try it today, it's free to install and includes 15 core components (Button, TextField, AppBar, BottomNav, InfoCard, StatCard, Badge, Avatar, ProgressBar, Dialog, Snackbar, Toast, SearchBar, Toggle, Divider) with full source code. Additional layout patterns and specialized components (dashboards, e-commerce, settings suites) are available as paid kits if you want pre-built starter apps.

The install:

```shell
# In your project
npx lustre-mcp

# Then add to your Claude Code / Cursor / Codex MCP config
# Full instructions: https://patrickbotkins.com/lustre
```
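The config entry follows the usual MCP shape in these tools; it typically looks something like this, though the exact file name and location vary by tool, so check your tool's docs:

```json
{
  "mcpServers": {
    "lustre": {
      "command": "npx",
      "args": ["lustre-mcp"]
    }
  }
}
```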

No API keys. No accounts. No backend. It's a locally run process that gives your AI coding assistant a design system to reference.

Takeaways

If you're building Flutter apps with AI assistance and you don't love how they look, you have three levers:

  1. Define design tokens. Even if you never touch AI tooling, this is the single highest-leverage thing you can do for consistency. A good token system is a month of saved review cycles.
  2. Make your tokens discoverable to the AI. Pasting them into every prompt doesn't scale. An MCP server does.
  3. Use a component library built on those tokens. Whether it's Lustre or something you build yourself, the AI generating composed widgets beats the AI generating raw Container + Padding + Text every time.

The stock-Material-defaults problem isn't going away on its own — the training data is what it is. But you can route around it with surprisingly little infrastructure, and the output quality jump is dramatic.

If you want to see the full component catalog and all three themes side-by-side, the showcase is at patrickbotkins.com/lustre. Feedback and issue reports are very welcome.


I'm @theLightDelta on X. If you end up trying lustre-mcp, I'd love to hear what you think — especially if you find something it gets wrong.
