Qian Li for DBOS, Inc.

Build & Deploy a Serverless OpenAI App in 9 Lines of Code

🚀 Want to build and deploy an interactive AI app to the cloud in just 9 lines of code?

In this tutorial, you'll use LlamaIndex to create a Q&A engine, FastAPI to serve it over HTTP, and DBOS to deploy it serverlessly to the cloud.

It's based on LlamaIndex's 5-line starter, with just 4 extra lines to make it cloud-ready. Simple, fast, and ready to scale!

Preparation

First, create a folder for your app, activate a virtual environment inside it, and create an empty main.py file.

python3 -m venv ai-app/.venv
cd ai-app
source .venv/bin/activate
touch main.py

Then, install dependencies and initialize a DBOS config file.

pip install dbos llama-index
dbos init --config

Next, to run this app, you need an OpenAI developer account. Obtain an API key from the OpenAI developer platform and set it as an environment variable.

export OPENAI_API_KEY=XXXXX

Declare the environment variable in dbos-config.yaml:

env:
  OPENAI_API_KEY: ${OPENAI_API_KEY}

Finally, let's download some data. This app uses the text from Paul Graham's essay "What I Worked On". You can download the text from this link and save it as data/paul_graham_essay.txt in your app folder.
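For example, on a Unix-like system you can create the data folder and fetch the essay from the command line (replace <essay-url> with the download link above; the URL here is a placeholder, not part of the original tutorial):

mkdir -p data
curl -o data/paul_graham_essay.txt <essay-url>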

Now, your app folder structure should look like this:

ai-app/
β”œβ”€β”€ dbos-config.yaml
β”œβ”€β”€ main.py
└── data/
    └── paul_graham_essay.txt

Load Data and Build a Q&A Engine

Now, let's use LlamaIndex to write a simple AI application in just 5 lines of code.
Add the following code to your main.py:

from llama_index.core import VectorStoreIndex, SimpleDirectoryReader

documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

query_engine = index.as_query_engine()
response = query_engine.query("What did the author do growing up?")
print(response)

This script loads the documents under the data/ folder, builds an index over them, and generates an answer by querying the index. Run the script and it should print a response, for example:

$ python3 main.py

The author worked on writing short stories and programming...

HTTP Serving

Now, let's add a FastAPI endpoint to serve responses through HTTP. Modify your main.py as follows:

from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from fastapi import FastAPI

app = FastAPI()

documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()

@app.get("/")
def get_answer():
    response = query_engine.query("What did the author do growing up?")
    return str(response)

Now you can start your app with fastapi run main.py. To see that it's working, visit http://localhost:8000 in your browser.
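If you prefer the command line, you can also query the endpoint with curl:

curl http://localhost:8000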

The result may be slightly different every time you refresh your browser window!

Hosting on DBOS Cloud

To deploy your app to DBOS Cloud, you only need to add two lines to main.py:

  • from dbos import DBOS
  • DBOS(fastapi=app)

Your main.py should now look like this:
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from fastapi import FastAPI
from dbos import DBOS

app = FastAPI()
DBOS(fastapi=app)

documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()

@app.get("/")
def get_answer():
    response = query_engine.query("What did the author do growing up?")
    return str(response)

Now, install the DBOS Cloud CLI if you haven't already (requires Node.js):

npm i -g @dbos-inc/dbos-cloud

Then freeze dependencies to requirements.txt and deploy to DBOS Cloud:

pip freeze > requirements.txt
dbos-cloud app deploy

In less than a minute, it should print Access your application at <URL>.
To see that your app is working, visit <URL> in your browser.
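You can also query the deployed endpoint from the command line, replacing <URL> with the URL printed by the deploy command:

curl <URL>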

Congratulations, you've successfully deployed your first AI app to DBOS Cloud! You can see your deployed app in the cloud console.

Next Steps

This is just the beginning of your DBOS journey. Next, check out how DBOS can make your AI applications more scalable and resilient.

Give it a try and let me know what you think 😊
