Pydantic Added 140ms to Every Request
A Pydantic validator on our API's request model was costing 140ms per request — just to check that an integer was positive and a string matched a regex. The actual business logic took 80ms. We were spending more time validating than working.
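The article doesn't show the model itself, but the setup it describes looks roughly like this. This is a minimal sketch assuming Pydantic v2; the field names (`quantity`, `sku`) and the exact regex are illustrative stand-ins, not the article's actual schema:

```python
from pydantic import BaseModel, Field


class CreateOrderRequest(BaseModel):
    # Pydantic runs these checks (plus type coercion) on every
    # instantiation, i.e. on every incoming request.
    quantity: int = Field(gt=0)                    # must be a positive integer
    sku: str = Field(pattern=r"^[A-Z]{3}-\d{4}$")  # must match a SKU-style regex
```

Declaratively, this is hard to beat: two lines buy type coercion, constraint checks, and structured error messages. The cost is that all of that machinery runs on every request, whether or not the payload needed it.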
Switching to Python's built-in dataclasses dropped that overhead to under 20ms. Same type safety in our IDE, same structured data, 7x faster runtime. The catch? We lost automatic coercion (strings to ints, loose JSON parsing) and had to write our own validation for the two fields that actually needed it. Turns out we didn't need Pydantic's kitchen sink — we needed a scalpel, not a Swiss Army knife.
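A stdlib-only replacement along the lines described above might look like the following sketch. The field names and regex are again hypothetical; the point is that only the two fields that need validation get it, via `__post_init__`:

```python
import re
from dataclasses import dataclass

# Hypothetical SKU format, compiled once at import time.
_SKU_RE = re.compile(r"^[A-Z]{3}-\d{4}$")


@dataclass
class CreateOrderRequest:
    quantity: int
    sku: str

    def __post_init__(self) -> None:
        # Hand-written checks for the two fields that actually need them.
        # Note there is no coercion: quantity="3" is rejected, not converted.
        if not isinstance(self.quantity, int) or self.quantity <= 0:
            raise ValueError("quantity must be a positive integer")
        if not _SKU_RE.fullmatch(self.sku):
            raise ValueError("sku must match the form ABC-1234")
```

This keeps the IDE-visible type hints and the structured-data ergonomics, but the trade-off from the article applies: anything Pydantic did implicitly (coercing strings to ints, parsing loose JSON) now has to be done explicitly, or not at all.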
This isn't an anti-Pydantic post. It's a reminder that every abstraction has a cost, and sometimes the cost exceeds the benefit. Here's when dataclasses beat Pydantic, when Pydantic is worth every millisecond, and the specific benchmark numbers that'll help you decide.
The Overhead Nobody Mentions in Tutorials
Continue reading the full article on TildAlice
