Glenn Viroux

How to Connect a FastAPI Server to PostgreSQL and Deploy on GCP Cloud Run

In the world of web development, a robust backend is crucial for delivering a high-quality user experience. One popular approach for building backends is by using RESTful APIs, which enable clients to communicate with servers using HTTP requests.

When I was building Ballistic, I chose FastAPI as the web framework for my backend API in Python, since it had gained significant popularity in the Python community in a very short amount of time. Apart from its performance, powered by its use of the asyncio library, I was convinced by the core integration of type annotations and the well-built documentation. Keep in mind that good documentation and a broad community can save you tons of time when developing your backend.

Building a web application involves more than just a web server; it requires a reliable and efficient database to store and manage the application's stateful data. Among the various database options available, PostgreSQL stands out as a popular choice for developers. Its extensive community and ecosystem provide invaluable support, making it easier to solve problems. PostgreSQL's reputation for reliability and stability further solidifies its position as a trusted solution. Additionally, PostgreSQL's compatibility with multiple platforms ensures flexibility in deployment. In this blog post, we will explore how to connect our FastAPI server with a PostgreSQL database. It's worth noting that the guide can be easily adapted for other databases like MySQL or MongoDB with minimal modifications.

When it comes to deploying REST servers, there are many options to choose from. One particularly appealing option is to use cloud providers like Google Cloud Platform (GCP), which offer scalable, stateless services like Cloud Run and App Engine. With a cloud provider like GCP, you won’t have to worry about keeping your servers up and available 100% of the time, even under high traffic loads. Your initial investment is also significantly lower, as you only pay for what you consume instead of buying a whole server up-front.

In this blog post, we'll explore how to build a REST server using FastAPI, PostgreSQL, and GCP, and deploy it to a stateless environment like GCP Cloud Run. At the end of the blog post, I’ll also include the full source code to help you follow along with the explanations given in the post.

Why Stateless REST Servers are Ideal for Cloud Deployments

Before diving into the details of building a REST server with GCP, let's first take a step back and consider why stateless servers are ideal for cloud deployments.

When a server is stateless, it means that it doesn't store any session data or state information between requests. This makes it easier to scale and deploy, since any instance of the server can handle any incoming request. This is in contrast to stateful servers, which need to keep track of session data and ensure that each request is handled by the same server instance.

Stateless servers also offer other benefits, such as improved reliability and fault tolerance. Since each request can be handled by any instance of the server, failures or crashes can be quickly detected and handled by spinning up new instances.

While it's essential for your app to keep track of user data, it's important to remember that this responsibility falls on the database. The database is designed to store and maintain data integrity, while the REST server is responsible for handling individual, independent requests that operate on the data stored in the database. By separating these responsibilities, your app can be more efficient and easier to maintain over time.

GCP Cloud Run vs. App Engine

When it comes to deploying REST servers on GCP, there are two main options to choose from: Cloud Run and App Engine.

Cloud Run is a fully-managed, container-based platform that allows you to run stateless HTTP containers in a serverless environment. This means you don't need to worry about managing infrastructure, and you only pay for the resources you use.

App Engine, on the other hand, is a platform-as-a-service (PaaS) offering that allows you to build and deploy web applications and APIs using a variety of programming languages and frameworks. It's a more fully-featured platform than Cloud Run, but it also requires more configuration and management.

Both Cloud Run and App Engine are good options for deploying REST servers, and the choice largely depends on your specific use case and requirements.

Building the Backend

Setting up a new project is the first step towards building any application. While there are multiple tools available for managing packages and dependencies, I personally recommend using poetry for its simplicity and ease of use. Alternatively, you can also use a plain old virtual environment to set up your project.

Assuming you have installed poetry in your system, you can use the following command to create a new project:

poetry new fastapi-gcp

This will create a new directory named fastapi-gcp with some default files and a basic project structure. Next, navigate to the newly created directory using cd fastapi-gcp and install the fastapi and asyncpg packages using poetry as shown below:

poetry add fastapi asyncpg

This will install the fastapi package along with its dependencies, as well as asyncpg, the asynchronous PostgreSQL driver we’ll need later, making them available for use in your project.

Building the FastAPI Server

Now, let's dive into the code and start building our FastAPI server. With just a few lines of code, we can create a simple server to handle requests. Create a main.py file and add the following code:

from fastapi import FastAPI
from fastapi.responses import HTMLResponse

app = FastAPI(title="fastapi-gcp")

@app.get("/", response_class=HTMLResponse)
def healthcheck():
    return "<h1>All good!</h2>"

Here, we are just using a minimal setup to create a server that returns an HTML response when someone opens the server’s root URL in the browser. To run this server on localhost:8080, execute the following command:

uvicorn main:app --host 0.0.0.0 --port 8080

Now that your server is running, open http://localhost:8080 in a browser to see the “All good!” message.
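
If you prefer the terminal, the same check works with curl:

curl http://localhost:8080/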

Since FastAPI is built around the OpenAPI standard, at this point you can also use the automatically generated documentation. There are multiple options, and two are included by default. Try them out by accessing the following URLs:

  • Swagger UI: http://localhost:8080/docs
  • ReDoc: http://localhost:8080/redoc

Setting Up the PostgreSQL DB

Now that our server is up and running, let's spin up our PostgreSQL server as well. The easiest way to accomplish this is by utilizing Docker and the official PostgreSQL image. If you have Docker installed, you can quickly achieve this by running a simple one-liner:

docker run --name postgres-fastapi -e POSTGRES_PASSWORD=postgres-fastapi --rm -p 5432:5432 postgres

Here, we start a container named "postgres-fastapi" with the password "postgres-fastapi," exposing the default PostgreSQL port 5432. If you'd like to delve deeper into this topic, you can find more detailed information in this article: How to Use the PostgreSQL Docker Official Image.
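
If you want to double-check that the container is accepting connections, a quick sanity check is to open a psql session inside it (using the default postgres superuser created by the official image) and list the available databases:

docker exec -it postgres-fastapi psql -U postgres -c "\l"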

Settings Management

Before connecting our FastAPI server to the PostgreSQL database, it is important to consider how we manage our settings. Storing our database credentials in plain text within our Python repository would pose a significant security risk, as they could easily be discovered by hackers or individuals with malicious intent. To address this concern, it is best practice to extract all sensitive credentials to an external environment file, such as a .env file, and then load these values into our FastAPI server. This approach also allows for flexibility: by simply using a different .env file, we can connect to a different database, which proves especially useful when running different versions of the application, such as one in a staging environment and another in production.

Since FastAPI is built on Pydantic, we can leverage the BaseSettings object to facilitate our settings management. To begin, we need to install the additional dependency by executing the following command:

poetry add pydantic[dotenv]

Let’s create a new file named settings.py and add the following code:

from pathlib import Path

from pydantic import BaseSettings

class Settings(BaseSettings):
    DB_NAME: str
    DB_HOST: str
    DB_USER: str
    DB_PASS: str

settings = Settings(_env_file=Path(__file__).parent / ".env", _env_file_encoding="utf-8")

This code assumes that you have a .env file in the same directory as this settings.py module that contains your database name (DB_NAME), the host (DB_HOST), the user (DB_USER) and the password (DB_PASS), like this:

DB_NAME=postgres
DB_HOST=localhost
DB_USER=postgres
DB_PASS=postgres-fastapi

Of course, this .env file should be listed in your .gitignore file to keep it out of the repository. At this point, you can easily use your database settings anywhere in your Python code, like this:

from settings import settings

print(f"My database name is {settings.DB_NAME}")

Connect FastAPI to the PostgreSQL DB

With our FastAPI and PostgreSQL servers up and running, and our settings management in place, we can now proceed to establish a connection between our FastAPI server and the PostgreSQL database.

To do this, we can use Tortoise ORM. Begin by installing the package:

poetry add tortoise-orm

Next, create a new file named database.py where we'll define all the details of the database connection. Here's an example:

from fastapi import FastAPI
from tortoise import Tortoise
from tortoise.contrib.fastapi import register_tortoise

from settings import settings

Tortoise.init_models(["models"], "models")

TORTOISE_ORM = {
    "connections": {
        "default": f"postgres://{settings.DB_USER}:{settings.DB_PASS}@{settings.DB_HOST}:5432/{settings.DB_NAME}",
    },
    "apps": {
        "models": {
            "models": ["models", "aerich.models"],
            "default_connection": "default",
        },
    },
    "use_tz": False,
}

def init_db(app: FastAPI) -> None:
    register_tortoise(
        app,
        config=TORTOISE_ORM,
        modules={"models": ["models", "aerich.models"]},
        generate_schemas=True,
        add_exception_handlers=True,
    )

In this database.py module, we define the core config of the ORM in the TORTOISE_ORM object, after which we define a function init_db that registers Tortoise ORM on an existing FastAPI application.

Note also that we're using Aerich, Tortoise ORM's migration tool, to perform database migrations. Database migrations are a crucial aspect of managing the evolution of a database schema over time. As an application evolves, the database schema often needs to be modified to accommodate new features, fix bugs, or optimize performance. A migration tool like Aerich provides a systematic approach to applying these changes while preserving existing data. Make sure to install the aerich dependency by running:

poetry add aerich

Let’s now use the init_db function in the main.py module where we define the FastAPI application:

from fastapi import FastAPI
from fastapi.responses import HTMLResponse

from database import init_db

app = FastAPI(title="fastapi-gcp")
init_db(app)

@app.get("/", response_class=HTMLResponse)
def healthcheck():
    return "<h1>All good!</h2>"

After implementing the init_db function, your FastAPI server has the capability to connect to your database. However, before proceeding, we need to create our first database model and initialize the Tortoise ORM integration by running the Aerich migrations for the first time. Let's create our initial Note model in a new models.py module (the module referenced in the TORTOISE_ORM config above); it will correspond to a table in the PostgreSQL database that stores a collection of notes with attributes such as filenames, titles, and content.

from tortoise import fields
from tortoise.models import Model

class Note(Model):
    id = fields.IntField(pk=True)
    created_at = fields.DatetimeField(auto_now_add=True)
    updated_at = fields.DatetimeField(auto_now=True)
    filename = fields.CharField(max_length=256)
    title = fields.CharField(max_length=1000)
    content = fields.TextField(default="")

Now, let’s initialize the aerich migrations by running:

aerich init -t database.TORTOISE_ORM
aerich init-db
aerich migrate
aerich upgrade

Next up, we’ll expand our main.py server module with two endpoints: one to create a new note in the database, and one to look up an existing note by its title:

from fastapi import FastAPI, HTTPException
from fastapi.responses import HTMLResponse
from pydantic import BaseModel

from database import init_db
from models import Note

app = FastAPI(title="fastapi-gcp")
init_db(app)

@app.get("/", response_class=HTMLResponse)
def healthcheck():
    return "<h1>All good!</h2>"

class CreateNotePayload(BaseModel):
    filename: str
    title: str
    content: str

@app.post("/notes")
async def create_note(payload: CreateNotePayload):
    note = await Note.create(**payload.dict())
    return {"message": f"Note created successfully with id {note.id}"}

@app.get("/notes/{title}")
async def get_note_by_title(title: str):
    if not (note := await Note.get_or_none(title=title)):
        raise HTTPException(status_code=404, detail="Note not found")

    return note.id

You can conveniently test the endpoints and the database connection by utilizing the Swagger or Redoc automatic documentation tools. Start by using the POST endpoint to create a new note, and then employ the GET endpoint to retrieve the ID of the recently created note. If this process succeeds, congratulations! You have successfully established a connection between your FastAPI server and the PostgreSQL database.
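
If you prefer the command line, the same round trip can be done with curl; the filename, title, and content below are just example values:

# create a new note
curl -X POST http://localhost:8080/notes \
  -H "Content-Type: application/json" \
  -d '{"filename": "first.txt", "title": "my-first-note", "content": "Hello from FastAPI"}'

# look the note up again by its title
curl http://localhost:8080/notes/my-first-note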

Deploying to GCP

Now that we have developed our backend and established the connection between our FastAPI server and the database, let's delve into the process of deploying our application to a cloud provider like GCP (Google Cloud Platform).

Deploying a PostgreSQL DB to GCP

Up until this point, we have been working with a local PostgreSQL server to test our application. However, when deploying our application to a cloud provider, it is essential to use a PostgreSQL server that runs in the cloud. Fortunately, you have several options for achieving this.

The most straightforward choice is to utilize the Cloud SQL service provided by GCP. Cloud SQL is a fully-managed database service that handles the PostgreSQL instance for you. Another option is to deploy the PostgreSQL database on a Compute Engine instance using the same PostgreSQL Docker image we used earlier for our local instance.

While deploying on a Compute Engine instance offers more flexibility, Cloud SQL holds several advantages as a fully managed solution:

  • Managed Service: With Cloud SQL, Google takes care of the underlying infrastructure, maintenance, and updates, relieving you of these responsibilities.
  • Scalability: Cloud SQL provides automatic scalability to handle varying workloads. It allows you to easily adjust resources such as CPU, memory, and storage based on your application's requirements. Scaling can be performed with minimal downtime and without the need to provision additional virtual machines.
  • Backup and Recovery: Cloud SQL offers automated backups of your databases, ensuring the safety of your data. You can effortlessly restore backups or perform point-in-time recovery to a specific moment in the past. Automated backups can be enabled, and the retention period can be set according to your needs.

It's important to note that these advantages come with a higher cost. The pricing for Cloud SQL depends on various factors, including load, CPU usage time, disk usage, and memory usage. For smaller databases, the costs of using Cloud SQL can be up to 10 times higher than deploying the same database on a Compute Engine instance. To get an estimate of the prices for different options, you can use the GCP Pricing Calculator.

Regardless of the deployment option you choose, make sure to update the database credentials in the .env file you created earlier to match the correct credentials of your deployed PostgreSQL database.
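
As a purely hypothetical example, after creating a Cloud SQL instance with a public IP address and a dedicated database user, your updated .env could look roughly like this (all values below are placeholders, not real credentials):

DB_NAME=notes
DB_HOST=34.123.45.67
DB_USER=fastapi
DB_PASS=<your strong password>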

Containerize the FastAPI Server

To facilitate the deployment of your application on GCP, it is recommended to containerize it using Docker. This involves creating a Dockerfile that installs Poetry, copies the FastAPI server code into the image, and configures the Uvicorn server as the entrypoint. Below is an example of a Dockerfile that accomplishes these tasks:

FROM python:3.11

WORKDIR /app
ENV PYTHONFAULTHANDLER 1
ENV PYTHONUNBUFFERED 1
ENV PIP_NO_CACHE_DIR off
ENV PIP_DISABLE_PIP_VERSION_CHECK on

RUN curl -sSL https://install.python-poetry.org | python3 -
ENV PATH="/root/.local/bin:${PATH}"

COPY poetry.lock .
COPY pyproject.toml .

RUN POETRY_VIRTUALENVS_CREATE=false poetry install --only main --no-interaction --no-ansi

COPY . /app

EXPOSE 8080

CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8080"]

In this Dockerfile, we start from the official Python base image, specify the working directory as /app, and install Poetry for managing the project's dependencies. Next, we copy the poetry.lock and pyproject.toml files into the container and install the project dependencies using Poetry. Then, we copy the FastAPI server code from the build context on the host machine into the /app directory inside the container.

We expose port 8080, assuming that the FastAPI server will listen on that port. If your server uses a different port, make sure to update the EXPOSE statement accordingly.

Finally, we set the entrypoint command to run the Uvicorn server with the appropriate parameters. In this example, it runs the FastAPI instance named app from the main module, binds it to all network interfaces (0.0.0.0), and listens on port 8080.

Feel free to modify this Dockerfile based on your specific requirements, such as adding additional dependencies or environment variables.

Once you have created your Dockerfile, you can build the image using the following command:

docker build -t fastapi-gcp-backend:1.0 .

If the poetry installation step during this build fails, make sure to remove the

packages = [{include = "fastapi_gcp"}]

line from your pyproject.toml file, if it's present. At that point in the build, only pyproject.toml and poetry.lock have been copied into the image, so Poetry cannot find and install the project's own fastapi_gcp package.
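
Once the image builds, you can optionally smoke-test it locally before pushing it. One sketch, assuming your .env is part of the build context (so COPY . /app picked it up) and you are on Linux, is to share the host network so the container can reach the local PostgreSQL instance on localhost:

docker run --rm --network host fastapi-gcp-backend:1.0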

Push image to GCP Artifact Registry

When the build has completed successfully, you can push the image to the GCP Artifact Registry and select it when creating a Cloud Run configuration to deploy your application with ease.

To push an image to the GCP Artifact Registry, you should tag it with the appropriate repository name and push it after authenticating to the repository. You can learn more about authenticating to a GCP repository here. You should tag your image like this:

LOCATION-docker.pkg.dev/PROJECT-ID/REPOSITORY/IMAGE

In our case, this will be:

us-east1-docker.pkg.dev/<your gcp project>/<your repo>/fastapi-gcp-backend

where you should fill in your own GCP project and repository. We can tag and push the image we built earlier like this:

docker tag fastapi-gcp-backend:1.0 us-east1-docker.pkg.dev/<your gcp project>/<your repo>/fastapi-gcp-backend:1.0
docker push us-east1-docker.pkg.dev/<your gcp project>/<your repo>/fastapi-gcp-backend:1.0
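
If Docker is not yet authenticated against your Artifact Registry location, the push will fail with a permission error; configuring the gcloud credential helper for the location used above usually resolves this:

gcloud auth configure-docker us-east1-docker.pkg.dev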

Deploy image to GCP Cloud Run

Now that your image has been successfully pushed to the GCP Artifact Registry, you can easily use this image in a Cloud Run configuration. To configure a new Cloud Run service, you can follow these steps using the GCP web console at https://console.cloud.google.com:

  1. Navigate to the Cloud Run product by clicking on the appropriate menu option.
  2. Click on "Create Service" to start the configuration process.
  3. In the "Create Service" page, you will find a section to specify the container image. Choose the option that allows you to select the image from the Artifact Registry.
  4. Select the image you just pushed to the Artifact Registry from the available options.
  5. Configure other settings as per your requirements, such as service name, region, authentication, and scaling options.
  6. Once you have completed the necessary configurations, click on "Create" to create the Cloud Run service.


In the “Authentication” section, make sure to select “Allow unauthenticated invocations” if you are creating a public API:


By following these steps, you will be able to configure a Cloud Run service that uses the container image from the Artifact Registry. This allows you to easily deploy your FastAPI application to a scalable and managed environment provided by GCP.

Please note that the steps may vary slightly depending on any updates to the GCP web console interface. Always refer to the GCP documentation for the most up-to-date instructions.
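
If you prefer the command line over the web console, the same deployment can also be sketched with the gcloud CLI; the service name, region, and image path below are assumptions you should adapt to your own project:

gcloud run deploy fastapi-gcp-backend \
  --image us-east1-docker.pkg.dev/<your gcp project>/<your repo>/fastapi-gcp-backend:1.0 \
  --region us-east1 \
  --allow-unauthenticated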

Conclusion

In conclusion, building a robust backend for your application is essential to provide a high-quality user experience. RESTful APIs provide an efficient way to build backends, and deploying them on a cloud platform like Google Cloud Platform (GCP) gives your servers scalability and statelessness. In this blog post, we explored the benefits of stateless REST servers for cloud deployments, discussed the differences between Cloud Run and App Engine on GCP, and learned how to build a REST server using FastAPI and PostgreSQL and deploy it to GCP Cloud Run. We also saw how an Object-Relational Mapper (ORM) like Tortoise ORM maps database tables to Python objects and provides an easy-to-use API for interacting with the database. By following the steps outlined in this post, you can build and deploy a scalable RESTful API that meets your application's requirements.

Possible improvements

There are several ways to optimize this workflow, and it is recommended that you explore these options if the benefits outweigh the additional effort required. Some potential improvements include:

  • Use Terraform to define GCP resources such as Cloud Run, Compute Engine, and CloudSQL in a declarative manner. This allows for automated deployments and standardized infrastructure provisioning, reducing the likelihood of human error and providing increased visibility.
  • Consider exploring alternatives to Tortoise ORM. While the documentation for Tortoise ORM is comprehensive, alternatives with larger communities, such as SQLModel, may be better equipped to help with more specific questions.

Source Code

You can find all the source code here: https://github.com/GlennViroux/fastapi-postgres-gcp-blog

Make sure to let me know if you have any comments, questions or suggestions!

Happy coding! :)
