DEV Community

vaasav kumar


Build an AI Assistant Web App Using Streamlit

In the previous blog, we built an AI assistant using PromptTemplate and LangChain to generate:

  • a clinic name
  • possible clinic locations

In this blog, we will take the same idea and turn it into a simple web app using Streamlit.

This makes the project more practical because users can:

  • select a clinic type
  • choose a city
  • set a distance range
  • instantly generate a clinic name and suggested locations

First, install Streamlit:

pip install streamlit

Store API Keys in a Separate File

To keep the code clean, we can store API keys in a separate Python file.

Create a file named secret_keys.py:

# secret_keys.py
arliai_api_key = "xxxx"
cerebras_api_key = "yyyy"
openai_api_key = "zzzz"
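Since secret_keys.py holds real credentials, it should never be committed to version control. If the project lives in a git repository, one entry in .gitignore keeps the file out:

```
# .gitignore
secret_keys.py
```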

Create LLM Configuration File

Now create a file named llm_conf.py.

This file will:

  • load API keys
  • configure providers
  • allow switching between LLMs easily
# llm_conf.py
import os
from secret_keys import arliai_api_key, cerebras_api_key, openai_api_key
from langchain_openai import ChatOpenAI

os.environ['ARLIAI_API_KEY'] = arliai_api_key
os.environ['CEREBRAS_API_KEY'] = cerebras_api_key
os.environ['OPENAI_API_KEY'] = openai_api_key

llm_providers = {
    "arliai": {
        "api_key": os.getenv("ARLIAI_API_KEY"),
        "base_url": "https://api.arliai.com/v1",
        "model": "GLM-4.7",
    },
    "cerebras": {
        "api_key": os.getenv("CEREBRAS_API_KEY"),
        "base_url": "https://api.cerebras.ai/v1",
        "model": "llama3.1-8b",
    },
    "openai": {
        "api_key": os.getenv("OPENAI_API_KEY"),
        "base_url": "https://api.openai.com/v1",
        "model": "gpt-4o-mini",
    }
}

def set_llm(llm_type, creative_level=0.7):
    my_provider = llm_providers[llm_type]
    llm = ChatOpenAI(
        model=my_provider["model"],
        api_key=my_provider["api_key"],
        base_url=my_provider["base_url"],
        temperature=creative_level,
    )
    return llm

This setup makes your app flexible.

You can switch providers like this:

llm = set_llm("openai")

or

llm = set_llm("cerebras")

Add Helper Assistant Function

Now create a file named helpers.py.

This file will contain the logic to:

  • generate a clinic name
  • generate possible clinic locations
  • return both results together
# helpers.py
from langchain_core.output_parsers import StrOutputParser
from langchain_core.runnables import RunnablePassthrough
from langchain_core.prompts import PromptTemplate

def search_location_for_clinic_with_name(llm, clinic_type, city, km):
    parser = StrOutputParser()

    name_template = PromptTemplate.from_template(
        "Opening a new {clinic_type} clinic. Generate only 1 professional name for it. "
        "No explanation, no numbering."
    )

    search_location_template = PromptTemplate.from_template(
        "Opening a new clinic named {new_clinic}. Suggest 5 locations in {city} where no competing clinic is found within {km} km. "
        "Return ONLY comma-separated area names. No explanation, no numbering."
    )

    name_chain = name_template | llm | parser

    # The first map runs the name chain and passes city/km through unchanged;
    # the second map keeps the generated name and feeds it into the location prompt.
    full_chain = (
        RunnablePassthrough()
        | {
            "new_clinic": name_chain,
            "city": lambda x: x["city"],
            "km": lambda x: x["km"],
        }
        | {
            "clinic_name": lambda x: x["new_clinic"],
            "locations": search_location_template | llm | parser,
        }
    )

    response = full_chain.invoke({
        "clinic_type": clinic_type,
        "city": city,
        "km": km
    })

    return response
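The chain returns a plain dict with two string fields. The same two-stage pattern can be sketched with ordinary functions, where the hypothetical fake_llm stands in for the real model call:

```python
def fake_llm(prompt: str) -> str:
    # Hypothetical stand-in for the real model; returns canned answers
    if "professional name" in prompt:
        return "BrightSmile Dental Care"
    return "Adyar, Anna Nagar, Velachery, T Nagar, Porur"

def search_location_sketch(clinic_type: str, city: str, km: int) -> dict:
    # Stage 1: generate the clinic name
    name = fake_llm(
        f"Opening a new {clinic_type} clinic. Generate only 1 professional name for it."
    )
    # Stage 2: feed the generated name into the location prompt
    locations = fake_llm(
        f"Opening a new clinic named {name}. Suggest 5 locations in {city} within {km} km."
    )
    return {"clinic_name": name, "locations": locations}

result = search_location_sketch("Dental", "Chennai", 7)
```

This is exactly what the two piped dicts in full_chain express: the output of stage one becomes an input of stage two, while city and km pass through untouched.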

Build the Streamlit Web App

Now create the main file named clinic.py.

This file will:

  • create the web interface
  • show select boxes and inputs
  • call the helper function
  • display the result
# clinic.py
import streamlit as st

from llm_conf import set_llm
from helpers import search_location_for_clinic_with_name

llm = set_llm('openai')

st.title("Generate Clinic Name & Locations")

clinic_type = st.sidebar.selectbox(
    "Pick a Type of Clinic",
    ("Ayurvedic", "Dental", "Eye", "Gynaecology", "Orthopedic", "Pediatric", "Physiotherapy", "Psychiatry", "Urology")
)

city = st.sidebar.selectbox(
    "Pick a City",
    ("Chennai", "Bangalore", "Mumbai", "Delhi", "Hyderabad", "Pune", "Madurai", "Jaipur", "Kochi", "Coimbatore")
)

km = st.sidebar.number_input(
    "Pick a Distance in KM",
    min_value=1,
    max_value=100,
    value=7
)

if clinic_type and city and km:
    response = search_location_for_clinic_with_name(llm, clinic_type, city, km)
    locations = response['locations'].strip().split(",")

    clinic_name = response['clinic_name'].strip()
    st.header(f"{clinic_type} Clinic: {clinic_name}")

    st.subheader("Generated Locations")
    for item in locations:
        st.write(f"- {item.strip()}")
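Model output is not always perfectly clean: entries may carry stray spaces or a trailing period. A small hypothetical helper (parse_locations is not part of the app above) makes the splitting step more robust before display:

```python
def parse_locations(raw: str) -> list[str]:
    # Split the comma-separated model output, trim whitespace,
    # strip a trailing period, and drop any empty entries
    return [part.strip().rstrip(".") for part in raw.split(",") if part.strip()]

cleaned = parse_locations(" Adyar, Anna Nagar , Velachery.")
```

In clinic.py you could then write `locations = parse_locations(response['locations'])` instead of the bare split.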

Run the Streamlit App

streamlit run clinic.py

Final Thoughts

In this blog, we converted our AI assistant into a Streamlit web app.

We created:

  • a file for API keys
  • a file for LLM configuration
  • a helper function for clinic name and location generation
  • a simple web app UI to run everything

This is a great starting point for building AI-powered tools with a real interface.

In the next blog, we can improve this further by adding LangChain Tools.
