Vardan Matevosian

Posted on • Originally published at matevosian.tech

Is OpenAI really running ChatGPT on a single PostgreSQL instance?

The headline of OpenAI’s recent article, https://openai.com/index/scaling-postgresql/, feels a bit clickbaity. If they truly ran on only one database instance, ChatGPT would have been dead on arrival.

But the reality is far more impressive: they use the right tool for each layer of the persistence stack:

  • They push PostgreSQL to its absolute limits: a single primary writer, yes, but backed by nearly 50 read replicas spread across regions worldwide.

  • For write-heavy workloads, they’ve wisely migrated to sharded systems like Azure Cosmos DB.

  • They’ve added layers of resilience:

    • connection pooling with PgBouncer,
    • query rate limiting,
    • caching with lock leasing,
    • cascading replication (in testing),
    • strict schema-change policies.
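The single-writer-plus-replicas pattern above can be sketched as a tiny read/write router. This is purely illustrative (the "connections" are just labels, and the routing names are my own, not OpenAI's): in practice a driver or a proxy in front of PgBouncer does this job.

```python
import itertools

class ReadWriteRouter:
    """Send writes to the primary; spread reads round-robin over replicas.

    Illustrative only: connections are plain strings, not real DB handles.
    """

    def __init__(self, primary, replicas):
        self.primary = primary
        self._replicas = itertools.cycle(replicas)  # round-robin over reads

    def connection_for(self, sql):
        # Naive classification: anything that isn't a SELECT goes to the
        # primary, mirroring the single-writer constraint.
        if sql.lstrip().upper().startswith("SELECT"):
            return next(self._replicas)
        return self.primary

router = ReadWriteRouter("primary", ["replica-1", "replica-2", "replica-3"])
print(router.connection_for("SELECT * FROM users"))  # → replica-1
print(router.connection_for("UPDATE users SET name = 'x'"))  # → primary
```

A real router also has to worry about replication lag and read-your-own-writes, which is exactly why keeping a single primary writer simplifies so much.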
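Query rate limiting is commonly implemented as a token bucket: allow short bursts, refill steadily, and shed load once the bucket runs dry. A minimal sketch, assuming a per-client bucket (the numbers and names are mine, not from the article):

```python
import time

class TokenBucket:
    """Allow bursts up to `capacity` queries, refilling `rate` tokens/second."""

    def __init__(self, rate, capacity):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # caller should back off or shed the query

bucket = TokenBucket(rate=1, capacity=5)
print([bucket.allow() for _ in range(6)])  # → [True, True, True, True, True, False]
```

The same idea applies whether the limiter lives in the application, in PgBouncer-adjacent middleware, or at the proxy layer; the point is that the database never sees the rejected queries.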
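"Caching with lock leasing" usually means that when a hot entry expires, one caller takes a short lease to recompute it while everyone else keeps serving the stale value, so an expired key doesn't send a thundering herd at the database. A toy in-process sketch (real deployments do this in a shared cache like Redis or memcached; all names here are hypothetical):

```python
import time

class LeasedCache:
    """Serve stale data while a single lease-holder refreshes the entry."""

    def __init__(self, ttl, lease_ttl=1.0):
        self.ttl = ttl
        self.lease_ttl = lease_ttl
        self._data = {}    # key -> (value, expires_at)
        self._lease = {}   # key -> lease_expires_at

    def get(self, key, recompute):
        now = time.monotonic()
        entry = self._data.get(key)
        if entry and entry[1] > now:
            return entry[0]  # fresh hit: no database touch
        if self._lease.get(key, 0) > now and entry:
            return entry[0]  # stale, but another caller holds the refresh lease
        self._lease[key] = now + self.lease_ttl  # take the lease and refresh
        value = recompute()
        self._data[key] = (value, now + self.ttl)
        self._lease.pop(key, None)
        return value

fetches = []
def load_user():
    fetches.append(1)  # stands in for an expensive query to the primary
    return {"id": 1}

cache = LeasedCache(ttl=30)
cache.get("user:1", load_user)  # miss: takes the lease, recomputes
cache.get("user:1", load_user)  # fresh hit: served from cache
print(len(fetches))  # → 1
```

The lease TTL bounds how long a crashed refresher can block others; after it expires, the next caller simply takes over.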

It’s not magic; it’s mature, thoughtful engineering at scale.

If you work with databases, this post is absolutely worth reading.
