Siddhant Goel

Why you should choose Starlette for your next SaaS

Starlette is a relatively recent entrant to the world of Python web frameworks, particularly if you compare it to the more established ones like Django, Flask, etc.

In this post, I'd like to make the case that in spite of its recency, it brings so much to the table that you should give it serious consideration if you're starting a new project today.

1. Starlette is asynchronous by default

Asynchronous programming is not new by any means. But mainline Python support for high-level asynchronous primitives can definitely be considered recent.

To be fair, Python has had the asyncore module for years (decades?) now, which enabled asynchronous programming at a rather low level. However, with the introduction of asyncio, writing asynchronous code in Python has become a lot more approachable and popular.

Starlette builds on top of all this and gives you an asynchronous web toolkit. When you're building a SaaS, this enables a ton of use cases that are usually not possible with synchronous web servers because of the concurrency model they run under.

Here are two examples:

1.1. Serving multiple requests in parallel

Assume that your SaaS application is starting to gain traction. At some point, you may need the ability to serve multiple users in parallel.

With conventional WSGI-based synchronous web servers, you can serve multiple requests at the same time by running more backend processes. If, on the other hand, your application is built on top of an asynchronous framework, you can achieve a lot more within a single process. This is because each individual process can use its CPU time more efficiently by switching contexts while waiting on I/O.

As always, it's hard to make generalizations in software, and the same applies here: a lot of these performance aspects depend on the individual characteristics of your application. Regardless, figuring out whether specific parts of your application can benefit from the asynchronous concurrency model is worth a shot.
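
To make this concrete, here's a minimal sketch of an async Starlette endpoint. The route and the simulated I/O are illustrative; asyncio.sleep() stands in for a database query or any other awaitable operation.

```python
import asyncio

from starlette.applications import Starlette
from starlette.responses import JSONResponse
from starlette.routing import Route


async def report(request):
    # While this coroutine is suspended on the await, the event loop is free
    # to pick up and serve other incoming requests within the same process.
    await asyncio.sleep(1)
    return JSONResponse({"status": "done"})


app = Starlette(routes=[Route("/report", report)])
```

A single worker process running this app can have many such requests in flight at once, because none of them hold on to the process while they wait.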

1.2. Making network calls without affecting other requests

In user-facing web applications, there are often use cases where you want to make network calls over the Internet as part of serving a user's HTTP request. Calling out to an external API, for instance an email delivery service, is a good example.

Performing such actions inside the main thread of a synchronous web application blocks the entire application until the network call finishes. In an asynchronous application, the server is free to switch context and do something else while the first context is busy with (network) I/O.

This means that if there's a situation where the server is processing two user requests and the first one makes a network call, the server can switch to processing the second request while the first request is waiting for the network call to finish. How neat is that?!
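
Here's a rough sketch of what that could look like. I'm assuming httpx as the asynchronous HTTP client, and the email API endpoint is a hypothetical placeholder.

```python
import httpx

from starlette.responses import JSONResponse


async def signup(request):
    payload = await request.json()
    async with httpx.AsyncClient() as client:
        # While we're waiting for the email service to respond, the event
        # loop is free to switch to other requests instead of sitting idle.
        await client.post(
            "https://email-api.example.com/send",
            json={"to": payload["email"], "subject": "Welcome!"},
        )
    return JSONResponse({"signed_up": True})
```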

2. Starlette gives you very high performance

The recommended way to run Starlette applications is under Uvicorn.

Uvicorn is an ASGI server implementation written in Python. ASGI, short for Asynchronous Server Gateway Interface, provides a standard interface between asynchronous Python web servers, frameworks, and applications. Uvicorn is one of the fastest implementations of the ASGI interface and pushes hard on performance.

From their homepage:

ASGI should help enable an ecosystem of Python web frameworks that are highly competitive against Node and Go in terms of achieving high throughput in IO-bound contexts. It also provides support for HTTP/2 and WebSockets, which cannot be handled by WSGI.

Installing Uvicorn using pip install 'uvicorn[standard]' also pulls in uvloop, an event loop written in Cython (a superset of Python that compiles down to fast C code), which Uvicorn will use in place of the standard library's event loop whenever possible.

There is a lot to unpack here, most of which would be a digression from the topic at hand. The gist of it all is that Starlette combined with Uvicorn is fast. Their documentation refers readers to this performance benchmark, which shows that the combination is indeed one of the fastest available in the Python ecosystem, serving slightly more than 71,000 requests per second.
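
As a quick sketch, this is roughly what running a Starlette app under Uvicorn looks like programmatically; the module layout and names are illustrative, and the equivalent command line would be uvicorn app:app.

```python
import uvicorn

from starlette.applications import Starlette

app = Starlette(routes=[])

if __name__ == "__main__":
    # With the [standard] extras installed, Uvicorn picks up uvloop
    # automatically as its event loop implementation.
    uvicorn.run(app, host="127.0.0.1", port=8000)
```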

3. The API is minimal but solid

One of the hallmarks of well-built APIs is that you can intuit where things are going to be. In my experience working with Starlette, that certainly seems to be the case here.

The overall API that Starlette gives you is very intuitive. The main entry point is a Starlette object which accepts URL routes and their associated handlers as initialization parameters (amongst others). This already makes up a major chunk of your application. The rest of the code - things like database access, template rendering, authentication, etc. - is organized neatly into distinct modules that you can import.

For instance, if I want to have my application behave differently depending on configuration variables, starlette.config is the package I need to look into. Different kinds of HTTP response classes are available under starlette.responses. Similarly, the authentication utilities are bundled inside starlette.authentication.
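
As a rough sketch of how these modules fit together (the DEBUG setting and the route are made up for illustration):

```python
from starlette.applications import Starlette
from starlette.config import Config
from starlette.responses import PlainTextResponse
from starlette.routing import Route

# Reads configuration from environment variables, falling back to an
# optional .env file.
config = Config(".env")
DEBUG = config("DEBUG", cast=bool, default=False)


async def homepage(request):
    return PlainTextResponse("Hello from Starlette!")


app = Starlette(debug=DEBUG, routes=[Route("/", homepage)])
```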


I hope that through this article I could give you a sense of the simplicity and power that Starlette has to offer. If you're looking to build a SaaS using Starlette, we offer an industry-standard boilerplate to kick-start your application. Alternatively, if you'd like to learn more about Starlette, their documentation is an excellent resource to peruse.


