PMPK Labs

How to Make AI Coding Assistants Actually Useful (Stop Fighting Framework Drift)

AI coding assistants are legitimately useful. They're also consistently wrong in a very specific, frustrating way — and most developers haven't found the fix yet.

This post explains the problem and how to solve it permanently.

The Problem: Framework Drift

Let me describe something that probably sounds familiar.

You're working on a Next.js 15 project using the App Router. You ask your AI assistant to add server-side data fetching to a new page. It generates this:

```js
export async function getServerSideProps(context) {
  const data = await fetchData(context.params.id);
  return { props: { data } };
}
```

That's Pages Router. It doesn't work in App Router. You correct it, and a few prompts later it does it again.
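For comparison, the App Router version of the same fetch is an async Server Component (a sketch; `fetchData` and the route path are stand-ins, not from any real project):

```tsx
// app/products/[id]/page.tsx (illustrative path; fetchData is a stand-in)
// Next.js 15 passes `params` as a Promise, so it must be awaited.
export default async function Page({
  params,
}: {
  params: Promise<{ id: string }>;
}) {
  const { id } = await params;
  const data = await fetchData(id);
  return <pre>{JSON.stringify(data, null, 2)}</pre>;
}
```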

Or you're working on a React 19 component. The AI wraps a component in forwardRef, a pattern that's now unnecessary because ref is a regular prop. Or it hand-writes useMemo and useCallback memoization that the React Compiler now handles automatically.
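The forwardRef shift looks like this side by side (a sketch; `TextInput` and `Props` are illustrative names):

```tsx
// React 18 and earlier: a wrapper is required to receive a ref.
const TextInput = React.forwardRef<HTMLInputElement, Props>(
  (props, ref) => <input ref={ref} {...props} />
);

// React 19: `ref` is just a prop. No wrapper needed.
function TextInput({ ref, ...props }: Props & { ref?: React.Ref<HTMLInputElement> }) {
  return <input ref={ref} {...props} />;
}
```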

This isn't the AI making things up. It's the AI pattern-matching to its training data — which contains years of code written before these frameworks made major shifts.

Here's the scale of the problem across common stacks:

Next.js: Pages Router dominated from 2016 to 2023. The App Router shipped in Next.js 13 (late 2022), became stable in 13.4 (2023), and is the default in Next.js 15. The training corpus is heavily weighted toward Pages Router patterns.

React: React 19 landed in late 2024 with ref-as-prop, Actions, and other changes, with the React Compiler rolling out alongside it. Patterns from React 16-18 are deeply encoded from years of training data.

SvelteKit: Svelte 5 introduced runes — a complete syntax shift for reactivity. Nearly all existing Svelte code uses the older pattern.

Flutter: Widget lifecycle, state management approaches, and null safety patterns have all evolved significantly since Flutter 1.x.

The result: you spend a meaningful percentage of your AI coding time reviewing output for staleness. That's the opposite of the productivity gain you're supposed to be getting.

The Fix: Rules Files

Every major AI coding tool supports some form of persistent instruction file:

  • Cursor: .cursorrules in project root
  • Claude Code: CLAUDE.md in project root
  • Cline: .clinerules
  • Windsurf: .windsurfrules

These files are injected as context at the start of every conversation. The AI treats them as high-priority instructions that shape every response it gives.

The idea is simple: you tell the AI exactly which patterns to use for your specific project, and it follows them.

Here's a practical example. A basic .cursorrules for a Next.js 15 App Router project:

```markdown
# Next.js 15 App Router Project

## Architecture
- This project uses Next.js 15 with the App Router. Never suggest Pages Router patterns.
- Default to Server Components. Only add `use client` when the component needs:
  - Browser APIs (window, document, navigator)
  - Event handlers (onClick, onChange, etc.)
  - React hooks that require client context (useState, useEffect)

## Data Fetching
- Use async Server Components for data fetching, not getServerSideProps or getStaticProps.
- Use the fetch() API with Next.js cache options.
- Route handlers go in app/api/[route]/route.ts using the new Response API.

## Metadata
- Use generateMetadata() export from page.tsx files. Never import from next/head.
```

After adding this file, the AI stops suggesting getServerSideProps. It defaults to Server Components. It uses generateMetadata(). The suggestions align with what the codebase actually needs.
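For instance, the route-handler rule steers the AI toward handlers shaped like this (a minimal sketch; the in-memory `products` map stands in for a real data source):

```typescript
// app/api/products/[id]/route.ts (illustrative; products stands in for a real data source)
const products: Record<string, { name: string }> = {
  "1": { name: "Widget" },
};

// App Router route handlers return standard Web `Response` objects,
// not the `res.json()` style of Pages Router API routes.
export async function GET(
  _request: Request,
  { params }: { params: Promise<{ id: string }> },
) {
  // In Next.js 15, `params` is a Promise and must be awaited.
  const { id } = await params;
  const product = products[id];
  if (!product) {
    return Response.json({ error: "Not found" }, { status: 404 });
  }
  return Response.json(product);
}
```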

Writing Rules That Work

Good rules files are specific, not general. "Follow best practices" does nothing. "Never use getServerSideProps — this project uses App Router" does a lot.

Effective rules have three properties:

1. They name the anti-pattern explicitly.

Don't just say what to do — say what not to do and why.
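For example, a rule that names the anti-pattern directly (illustrative wording, not from any shipped rules file):

```markdown
## Refs
- Never wrap components in forwardRef. This project uses React 19, where `ref` is a regular prop.
- Reason: forwardRef still compiles, so the mistake won't surface until code review.
```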

2. They handle the edge cases.

The AI already handles the common cases well enough. Write rules for the patterns it consistently gets wrong.

3. They're maintained with the framework.

A .cursorrules file written for Next.js 13 will drift out of date as Next.js 15 evolves. Treat it like documentation — update it when you hit new anti-patterns.

The Time Investment (and How to Skip It)

Here's the honest version: writing genuinely good rules files for a single stack takes several hours. You need to read the migration guides, identify every pattern the AI gets wrong, write rules precise enough to override the training prior, and test them across different task types.

I went through that process over about 3 months across the stacks I work with professionally: Next.js 15, React 19, Flutter, FastAPI, Spring Boot 3, SvelteKit, Bun+Hono, Astro 5, Nuxt 3, and Expo. The result is 2,000+ lines of rules that I've tested against real project work.

I packaged them as DevRules Pro — available for $19 one-time. It includes an interactive CLI installer so you can pick which stacks to install and drop the files in the right place automatically.

Start Today

If you take nothing else from this: open your current project, create a .cursorrules file, and add 5-10 rules about your specific stack and patterns.

You'll notice the difference in the next coding session.

The AI coding tools we have today are genuinely capable. They just need context — and the rules file is how you give it to them.


DevRules Pro — 10 battle-tested rules files for the stacks shipping in 2025. Works with Cursor, Claude Code, Cline, Windsurf.
https://pooripatkh.github.io/devrules-pro/
