
Flynn L

Show Dev: I built an AI research workbench with React, FastAPI, and multi-agent LLMs

I built Arbor, a tool that takes research questions and builds interactive knowledge graphs from academic papers.

How the pipeline works

  1. Moderation — GPT-4o-mini screens the question for prohibited content
  2. Decomposition — Gemini 2.5 Flash breaks the question into 4 focused sub-inquiries
  3. Search — Parallel queries to arXiv and Semantic Scholar APIs
  4. Screening — Gemini 2.0 Flash scores papers for relevance
  5. Extraction — Gemini 2.0 Flash pulls key findings from top papers
  6. Synthesis — Gemini 2.5 Flash assembles findings into structured markdown
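The six stages above can be sketched as a small async pipeline. This is an illustrative outline, not Arbor's actual code — the function names, model routing, and return shapes are assumptions; only the stage order and the parallel fan-out to both paper APIs come from the post.

```python
import asyncio

async def moderate(question: str) -> bool:
    # Stage 1: screen for prohibited content (GPT-4o-mini in the real app).
    return True  # placeholder: this sketch allows everything

async def decompose(question: str) -> list[str]:
    # Stage 2: break the question into 4 focused sub-inquiries.
    return [f"{question} (sub-inquiry {i + 1})" for i in range(4)]

async def search_arxiv(inquiry: str) -> list[dict]:
    # Stage 3a: placeholder for a real arXiv API query.
    return [{"source": "arxiv", "inquiry": inquiry}]

async def search_semantic_scholar(inquiry: str) -> list[dict]:
    # Stage 3b: placeholder for a real Semantic Scholar API query.
    return [{"source": "semantic_scholar", "inquiry": inquiry}]

async def run_pipeline(question: str) -> list[dict]:
    if not await moderate(question):
        raise ValueError("question rejected by moderation")
    inquiries = await decompose(question)
    # Fan out both paper APIs for every sub-inquiry in parallel.
    results = await asyncio.gather(
        *(search(inq)
          for inq in inquiries
          for search in (search_arxiv, search_semantic_scholar))
    )
    papers = [paper for batch in results for paper in batch]
    # Stages 4-6 (screening, extraction, synthesis) would consume `papers`.
    return papers

papers = asyncio.run(run_pipeline("How do transformers scale?"))
print(len(papers))  # 4 inquiries x 2 sources = 8
```

The key property is that only search fans out concurrently; the other stages are sequential because each consumes the previous stage's output.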

All stages stream their results via SSE, so the graph builds in real time as papers are found and processed.
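The streaming side can be reduced to framing each pipeline event as an SSE message. This is a minimal sketch under my own assumptions — the event names and payload shapes are invented, not Arbor's actual protocol — but a generator like this is exactly what you'd hand to FastAPI's `StreamingResponse` with `media_type="text/event-stream"`:

```python
import json

def sse_event(event: str, data: dict) -> str:
    # One SSE frame: a named event plus a JSON payload, ended by a blank line.
    return f"event: {event}\ndata: {json.dumps(data)}\n\n"

def stream_stages(question: str):
    # Each yield reaches the browser immediately, so the frontend can add
    # graph nodes as stages complete rather than waiting for the full run.
    yield sse_event("stage", {"name": "moderation", "status": "done"})
    yield sse_event("stage", {"name": "decomposition", "status": "done"})
    yield sse_event("node", {"id": "paper-1", "title": "Example paper"})
    yield sse_event("complete", {"question": question})

frames = list(stream_stages("How do transformers scale?"))
print(frames[0])
```

On the React side, a standard `EventSource` listener per event name is enough to drive the graph updates.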

One interesting architectural decision:

I used a fallback decomposition strategy. If the LLM decomposition call fails (timeout, API outage, rate limit), the backend generates four templated inquiries from the question stem. The research continues with degraded but functional results instead of surfacing an error. Users barely notice.
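A sketch of that fallback path, with heavy caveats: the template wording and the `llm_decompose` stub are my own inventions (the stub always fails here, purely to exercise the fallback), not Arbor's actual prompts or client code.

```python
import asyncio

async def llm_decompose(question: str) -> list[str]:
    # Stand-in for the real OpenRouter call; raises to simulate
    # a timeout / outage / rate limit.
    raise TimeoutError("LLM unavailable")

# Hypothetical templates keyed on the question stem.
TEMPLATES = [
    "What is the current state of research on {q}?",
    "What methods are commonly used to study {q}?",
    "What are the key open problems around {q}?",
    "What recent findings challenge assumptions about {q}?",
]

async def decompose_with_fallback(question: str) -> list[str]:
    try:
        return await llm_decompose(question)
    except Exception:
        # Degrade gracefully: templated inquiries keep the pipeline
        # moving instead of showing the user an error.
        return [t.format(q=question) for t in TEMPLATES]

inquiries = asyncio.run(decompose_with_fallback("graph neural networks"))
print(inquiries[0])
```

The rest of the pipeline doesn't need to know which path produced the four inquiries, which is what makes the degradation invisible to users.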

Stack

  • Frontend: React 19, TypeScript, Vite, Tailwind, React Flow
  • Backend: FastAPI, SQLAlchemy, aiosqlite
  • LLMs: OpenRouter (Gemini 2.5 Flash, Gemini 2.0 Flash, GPT-4o-mini)
  • Hosting: Vercel (frontend) + Railway (backend, $5/mo)

Try it free: arborinquiries.com

I'd love feedback — especially on the graph UX and research quality.
