DEV Community

Midas Tools

What 200 Lines of Python Can Teach You About Building AI Products

Andrej Karpathy just published microgpt — a complete GPT implementation in 200 lines of pure Python. No dependencies. No frameworks. Just math and logic.

It hit the top of Hacker News with 800+ points.

I spent a few hours reading it. Here's what it taught me about building AI products — not as a researcher, but as a founder.


1. Simplicity is a competitive moat

Every AI company right now is racing to add features. More integrations, more dashboards, more complexity.

Karpathy went the other direction. He stripped GPT down to its irreducible core — dataset, tokenizer, autograd, transformer, optimizer — and proved that all the rest is just efficiency.

For founders: your MVP should fit in 200 lines too. If you can't explain what your product does in one sentence, you've already over-engineered it.

Question to ask yourself: What is the minimum thing that delivers value? Ship that. Nothing else.


2. Understanding the substrate gives you leverage

Most AI products today are thin API wrappers. They call OpenAI, format the output, and charge a markup.

That works — until someone with deeper understanding undercuts you or builds something you can't.

Karpathy's microgpt exists precisely to give people the intuition they need. When you understand why attention works, you can make product decisions that API-wrapper founders can't.

You don't need to implement your own GPT. But you should understand:

  • Why context windows are expensive
  • Why few-shot prompting works
  • Why fine-tuning beats prompting for some tasks
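On the first point, a back-of-envelope sketch makes the cost concrete. This is a toy single-head attention in pure Python (no learned projections, made-up vectors — an illustration of the mechanism, not microgpt's code): every query is scored against every key, so the work grows with the square of the context length.

```python
import math

def attention(xs):
    """xs: list of n token vectors; returns n attended vectors."""
    n, d = len(xs), len(xs[0])
    out = []
    for q in xs:  # n queries...
        # ...each scored against all n keys: n*n dot products total
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in xs]
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        z = sum(exps)
        weights = [e / z for e in exps]
        # output is a weighted average of all token vectors
        out.append([sum(w * v[j] for w, v in zip(weights, xs))
                    for j in range(d)])
    return out

# n tokens -> n*n pairwise scores: the quadratic wall behind
# context-window pricing. Doubling the context quadruples the work.
for n in (1_000, 2_000, 4_000):
    print(f"{n:>5} tokens -> {n * n:>12,} pairwise scores")
```

Double the context and you pay four times the attention compute — which is why long context windows cost what they do.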

This is what separates AI products that last from ones that get commoditized.


3. The real product is the training data, not the model

Look at what microgpt trains on: 32,000 names. Simple, structured, consistent.

The model learns the statistical patterns — and then generates plausible-sounding new names.
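To see what "learns the statistical patterns" means at its simplest, here is a character-bigram version of the idea — far cruder than microgpt's transformer, and the name list below is made up for illustration — that counts which character follows which, then samples new names from those counts:

```python
import random

# Tiny illustrative dataset (microgpt trains on ~32,000 real names).
names = ["emma", "olivia", "ava", "mia", "amelia", "ella", "lena"]

# "Training": count how often each character follows each character,
# with '.' marking the start and end of a name.
counts = {}
for name in names:
    seq = ["."] + list(name) + ["."]
    for a, b in zip(seq, seq[1:]):
        row = counts.setdefault(a, {})
        row[b] = row.get(b, 0) + 1

def sample_name(rng):
    # "Generation": walk the learned statistics until we hit '.'
    out, ch = [], "."
    while True:
        nxt = counts[ch]
        ch = rng.choices(list(nxt), weights=list(nxt.values()))[0]
        if ch == ".":
            return "".join(out)
        out.append(ch)

rng = random.Random(42)
print([sample_name(rng) for _ in range(5)])
```

The output is plausible-sounding names that were never in the training set — the whole trick, at every scale from this toy to GPT-4, is learning a distribution and sampling from it.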

Now think about your AI product. What's the data moat?

  • A legal AI trained on your firm's case history
  • A customer support AI trained on your specific product docs
  • A sales AI trained on your best rep's calls

The model is a commodity. The data is the business.


4. First principles thinking kills analysis paralysis

Karpathy didn't ask "which LLM framework should I use?" He asked: "What is the irreducible set of things a GPT needs to exist?"

Dataset. Tokenizer. Autograd. Architecture. Optimizer. That's it.
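Even the scariest item on that list — autograd — fits in a few dozen lines of pure Python. Here is a minimal sketch in that spirit (a simplification for illustration, not microgpt's actual code: only addition and multiplication are supported), used to fit a one-parameter model by gradient descent:

```python
class Value:
    """A scalar that remembers how it was computed, so gradients
    can flow backward through + and * via the chain rule."""

    def __init__(self, data, children=()):
        self.data = data
        self.grad = 0.0
        self._children = children
        self._backward = lambda: None

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # topological sort, then apply the chain rule back to front
        topo, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for c in v._children:
                    build(c)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

# Optimizer: fit y = w*x to the point (x=2, y=6) by gradient descent.
w = Value(0.0)
for _ in range(50):
    err = w * 2.0 + (-6.0)
    loss = err * err          # squared error
    w.grad = 0.0
    loss.backward()
    w.data -= 0.05 * w.grad
print(round(w.data, 2))  # → 3.0
```

That is the "autograd" and "optimizer" boxes, demystified. The point isn't to ship this — it's that once you've seen each piece at this size, framework debates stop feeling existential.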

Founders get stuck in framework debates, tool comparisons, and infrastructure decisions. Almost none of it matters before you have 10 paying customers.

Start with: "What is the irreducible set of things my customer needs to get value?" Build that. Ignore the rest.


5. Beautiful things get noticed

Here's the thing about microgpt: it got 800+ upvotes on HN not because it's useful (it trains on toy data), but because it's beautiful.

Karpathy literally said: "I think it is beautiful 🥹"

And the community responded.

In a world drowning in AI noise, craft still cuts through. Take pride in the thing you're building. Make it elegant. Make it something you're proud of.

Users can feel the difference between something built with care and something churned out.


The takeaway

microgpt is a 200-line Python file. But it contains a decade of obsession distilled into its cleanest form.

Your product doesn't need to be a decade in the making. But it should contain your clearest thinking about the problem.

Strip it down. Find the core. Build that.


Building toward $1M ARR in the AI space. Writing about what I learn along the way. Follow along at midastools.co.
