Oluwamayowa Adeoni
Building Scalable AI Apps with React & FastAPI

Step-by-step guide to integrating frontend and backend for production ML apps

In the AI era, it’s no longer enough to train great models; you need to deploy them in apps people can actually use.

This guide walks you through building scalable, real-world AI applications using React (or Next.js) on the frontend and FastAPI on the backend, a powerful stack for shipping intelligent tools quickly and effectively.


Architecture Overview

[React / Next.js]  <--->  [FastAPI Backend]  <--->  [ML Model / LLM API]
     User Interface          Business Logic + Inference

Tech Stack

  • Frontend: React.js or Next.js
  • Backend: FastAPI (Python, async-ready)
  • Model Serving: Custom models, Hugging Face, or OpenAI APIs
  • Database (optional): Supabase, PostgreSQL, MongoDB
  • Deployment: Vercel (frontend) + Render / Railway / Fly.io (backend)

Step-by-Step Guide

Set Up Your FastAPI Backend

mkdir ml-backend && cd ml-backend
python -m venv venv && source venv/bin/activate
pip install fastapi uvicorn

main.py

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class TextInput(BaseModel):
    text: str

@app.post("/api/analyze/")
def analyze_text(payload: TextInput):
    result = {"length": len(payload.text), "uppercase": payload.text.upper()}
    return result

Run it:

uvicorn main:app --reload

Create the React Frontend

npx create-react-app ml-frontend
cd ml-frontend
npm install axios

App.js

import { useState } from "react";
import axios from "axios";

function App() {
  const [text, setText] = useState("");
  const [result, setResult] = useState(null);

  const handleSubmit = async () => {
    const res = await axios.post("http://localhost:8000/api/analyze/", { text });
    setResult(res.data);
  };

  return (
    <div style={{ padding: "2rem" }}>
      <h2>Text Analyzer</h2>
      <textarea value={text} onChange={(e) => setText(e.target.value)} />
      <br />
      <button onClick={handleSubmit}>Analyze</button>
      {result && (
        <div>
          <p>Uppercase: {result.uppercase}</p>
          <p>Length: {result.length}</p>
        </div>
      )}
    </div>
  );
}

export default App;

Swap in Your ML Model

Let’s plug in a real Hugging Face sentiment classifier:

pip install transformers
Then update main.py:
from transformers import pipeline

# Load the model once at startup, not on every request
classifier = pipeline("sentiment-analysis")

@app.post("/api/analyze/")
def analyze_text(payload: TextInput):
    result = classifier(payload.text)
    return {"label": result[0]["label"], "score": result[0]["score"]}
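Since model inference is comparatively expensive, it's worth rejecting bad input at the schema level before it ever reaches the classifier. A minimal sketch using Pydantic's Field constraints (the 2000-character cap is an arbitrary example, not a recommendation):

```python
from pydantic import BaseModel, Field, ValidationError

class TextInput(BaseModel):
    # Reject empty or oversized payloads before they reach the model
    text: str = Field(..., min_length=1, max_length=2000)

TextInput(text="fine")  # passes validation

try:
    TextInput(text="")  # too short: raises ValidationError
except ValidationError:
    print("rejected")
```

With this schema in place, FastAPI automatically returns a 422 response for invalid requests, so your endpoint code never sees them.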

Deployment Tips

Enable CORS in FastAPI so the React dev server can call your API. CORSMiddleware ships with FastAPI itself, so no extra install is needed:
from fastapi.middleware.cors import CORSMiddleware

app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],  # lock this down to your frontend's domain in production
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)
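In production you'll want to replace the wildcard with your actual frontend domain(s). One common pattern is to read them from the environment so the same code works locally and on Render/Railway — a sketch, where ALLOWED_ORIGINS is a hypothetical variable name you'd set yourself:

```python
import os

# Comma-separated list, e.g. "https://myapp.vercel.app,http://localhost:3000"
origins = os.environ.get("ALLOWED_ORIGINS", "http://localhost:3000").split(",")
print(origins)
```

Pass `origins` as `allow_origins=origins` in the `add_middleware` call above.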

Using OpenAI or LangChain

Want to make it LLM-powered?

pip install openai
Then update main.py (this uses the modern openai>=1.0 client API):
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

@app.post("/api/analyze/")
def analyze_text(payload: TextInput):
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": payload.text}]
    )
    return {"reply": response.choices[0].message.content}

Final Thoughts

This stack gives you the best of both worlds:

  • Python’s flexibility for AI/ML
  • React’s power for interactive UIs

Start simple, scale fast.

