DEV Community

Vasu Ghanta

AI-Generated Code Is Quietly Poisoning Your Dependency Tree

You paste a prompt into Copilot or ChatGPT, get back a tidy package.json or requirements.txt, and move on. The service boots. Tests pass. Ship it.

Three months later, your security scanner lights up with 14 CVEs, your Docker image weighs 1.2 GB for a service that sends emails, and nobody on the team can explain why lodash, moment, and request are in a project that was supposed to use the native fetch API.

This is dependency drift, and AI assistants are accelerating it in ways most teams haven't fully reckoned with yet.

Why AI Assistants Generate Bloated Dependency Graphs

Large language models are trained on vast amounts of public code. That code skews heavily toward older patterns — Stack Overflow answers from 2017, tutorials written before Node 18 existed, pip packages that were popular before the standard library caught up.

When you ask an AI to scaffold a REST service, it reaches for what it's seen most often. That means axios instead of native fetch, moment instead of Temporal or date-fns, lodash for utilities that are one-liners in modern JavaScript. In Python, it might pull in requests when httpx is already in your stack, or add PyYAML when you're already using pydantic with built-in TOML support.

The AI isn't wrong, exactly. These packages work. But they're not your packages, and they're not always the right ones for 2024.

What a Real AI-Generated Dependency Graph Looks Like

Here's a trimmed package.json that Claude generated for a "simple webhook receiver" when given a minimal prompt:

{
  "dependencies": {
    "axios": "^1.4.0",
    "body-parser": "^1.20.2",
    "dotenv": "^16.0.3",
    "express": "^4.18.2",
    "lodash": "^4.17.21",
    "moment": "^2.29.4",
    "uuid": "^9.0.0",
    "winston": "^3.8.2"
  }
}

body-parser has been bundled into Express since v4.16. moment is maintenance-only and ships 300 KB of locale data. uuid can be replaced with crypto.randomUUID(). axios duplicates fetch. That's four unnecessary dependencies before you've written a single line of business logic.
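Applying those cuts leaves a much smaller manifest. A plausible trimmed version of the same scaffold (a sketch — which packages survive depends on what the service actually uses) looks like:

```json
{
  "dependencies": {
    "dotenv": "^16.0.3",
    "express": "^4.18.2",
    "winston": "^3.8.2"
  }
}
```

Everything removed is covered either by Express itself (body-parser) or by modern Node built-ins: fetch, crypto.randomUUID(), and plain-language utilities in place of lodash and moment.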

A comparable Python scaffold for a FastAPI service regularly pulls in requests, python-dotenv, and pydantic-settings alongside pydantic v2, even though pydantic-settings is a separate install whose environment handling overlaps with configuration the developer may already have in place.

In Helm charts, the pattern is even messier — AI-generated charts frequently include full cert-manager, ingress-nginx, and metrics-server dependencies for services that will run inside a cluster where those are already provided globally.

How to Detect the Problem

Eyeballing a lockfile doesn't scale. You need automated signals.

For npm projects, depcheck identifies packages that are declared but never imported:

npx depcheck --ignores="eslint-*,@types/*"

Pair it with npm audit and bundlephobia-cli to surface both security risk and size impact simultaneously.

For Python, pip-audit covers CVE exposure and deptry does what depcheck does for npm — finds unused or misplaced dependencies across your pyproject.toml.

pip install deptry && deptry .

For Helm, helm lint with a strict values.yaml schema and helm dependency list will expose charts pulled in transitively that you didn't intend to include.

The Fix: A Linter-Style Guardrail Rule Set for CI

The most durable solution isn't better prompts — it's treating dependency hygiene the same way you treat code style: enforce it in CI, fail the build when it breaks.

Here's a minimal rule set worth implementing:

Rule 1 — No deprecated packages. Maintain a blocklist (moment, request, body-parser standalone, etc.) checked on every PR. A simple jq query against package-lock.json or a custom deptry config handles this.

Rule 2 — No duplicate-purpose packages. If axios and node-fetch are both present, fail. If requests and httpx coexist, fail. This is detectable via dependency graph analysis.

Rule 3 — Lockfile must be committed and reproducible. Any PR that changes package.json without a corresponding lockfile change should block merge.

Rule 4 — Size budgets for production images. Set a maximum image size threshold in your CI pipeline. A webhook receiver has no business being over 200 MB.

Rule 5 — CVE gates. pip-audit and npm audit --audit-level=high should be non-negotiable steps, not optional ones.
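Rules 1 and 2 are simple enough to enforce with a short Node script rather than extra tooling. A minimal sketch, assuming a hand-maintained blocklist and a list of duplicate-purpose pairs (both lists below are illustrative, not exhaustive):

```javascript
// Sketch of a CI guardrail for Rules 1 and 2, run against package.json.
// The blocklist and duplicate-purpose pairs are illustrative assumptions;
// maintain your own alongside the rest of your lint config.
const BLOCKLIST = new Set(["moment", "request", "body-parser"]);
const DUPLICATE_PAIRS = [
  ["axios", "node-fetch"],
  ["lodash", "underscore"],
];

function auditDependencies(pkg) {
  const deps = Object.keys(pkg.dependencies ?? {});
  const errors = [];
  for (const dep of deps) {
    if (BLOCKLIST.has(dep)) errors.push(`blocked package: ${dep}`);
  }
  for (const [a, b] of DUPLICATE_PAIRS) {
    if (deps.includes(a) && deps.includes(b)) {
      errors.push(`duplicate-purpose packages: ${a} + ${b}`);
    }
  }
  return errors;
}

// In CI: read package.json and exit non-zero when any rule fires.
// const pkg = JSON.parse(require("node:fs").readFileSync("package.json", "utf8"));
// const errors = auditDependencies(pkg);
// if (errors.length) { console.error(errors.join("\n")); process.exit(1); }
```

Because it only reads package.json, the check runs in milliseconds and needs no network access, which keeps it cheap enough to gate every PR.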

A GitHub Actions snippet that wires this together:

- name: Dependency audit
  run: |
    npm audit --audit-level=high
    npx depcheck
    npx bundlephobia-cli --limit 50kb package.json

The Underlying Tension

There's a real tradeoff here. AI-generated scaffolding saves hours. Enforcing strict dependency policies adds friction to that speed. The answer isn't to abandon AI tooling — it's to stop treating AI output as the final word on project structure.

Think of the AI's output as a first draft from a competent junior developer who learned to code in 2019 and hasn't kept up with ecosystem changes. You wouldn't merge that PR without review. Don't skip the review just because the author is a language model.

The teams that will get this right are the ones who build golden-path templates — curated, maintained package.json or pyproject.toml starters — and use AI to fill in logic, not structure. Let the AI write the handler function. You decide what's in the dependency tree.

Your lockfile is a security surface. Treat it like one.

Top comments (4)

Bhavin Sheth

This hit home. I’ve started noticing the same thing when using AI to scaffold small tools — it adds packages I never asked for. Recently it included axios and lodash in a project where native fetch and plain JS were enough.

Now I always review dependencies before installing and run depcheck after setup. It keeps the project lighter and avoids security and maintenance problems later. AI saves time, but dependency decisions still need human review.

Vasu Ghanta

Bhavin, I appreciate you sharing your experience. The axios/lodash example is exactly the kind of subtle bloat I’ve been seeing too—especially when native APIs are perfectly sufficient. Your approach of reviewing dependencies upfront and running depcheck is a great safeguard. AI can accelerate scaffolding, but as you said, dependency choices still require deliberate human judgment. Thanks for reinforcing that point with a practical workflow.

klement Gunndu

The CI blocklist approach is solid, but I'd push back on failing builds for duplicate-purpose packages — sometimes httpx and requests coexist intentionally during a migration. A warning with a required justification comment might scale better.

Vasu Ghanta

Klement, that’s a fair point. During migrations, temporary overlap like httpx and requests can absolutely be intentional. I agree that an enforced failure might be too rigid in some cases—requiring a justification comment alongside a warning could strike a better balance between governance and flexibility. The goal isn’t to block progress, but to make duplication explicit and intentional. Thanks for adding that nuance.