Suliman Munawar khan
🧠 Build Your Own Document Q&A Assistant with GPT, Redis & Docker

Have you ever wanted to just ask questions about a document—whether it's a research paper, contract, or report—and get an AI-powered answer instantly? That’s exactly what I built: a Document Q&A Assistant that uses GPT-3.5, Redis with vector search, and Docker to make document understanding smarter and more interactive.

In this post, I’ll walk you through the tech stack, what I’ve learned, how everything fits together, and where I plan to take it next.

🚀 Project Overview

This is a backend service that allows users to:

  • Upload a PDF file
  • Ask natural language questions about it
  • Get answers powered by OpenAI’s GPT-3.5-Turbo

Behind the scenes, it uses RAG (Retrieval-Augmented Generation) with Redis Vector Search.
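
To make that concrete, here is a minimal sketch of what the HTTP surface can look like in Node.js with Express and multer. The route paths (/upload, /ask) and the helper names (ingestPdf, answerQuestion) are illustrative placeholders, not necessarily what the actual repo uses; both helpers are sketched in the architecture section below.

const express = require('express');
const multer = require('multer');

const upload = multer({ storage: multer.memoryStorage() });
const app = express();
app.use(express.json());

// POST /upload: receive a PDF, chunk it, embed it, store the vectors in Redis
app.post('/upload', upload.single('file'), async (req, res) => {
  await ingestPdf(req.file.buffer); // sketched later in this post
  res.json({ status: 'indexed' });
});

// POST /ask: embed the question, retrieve similar chunks, ask GPT-3.5
app.post('/ask', async (req, res) => {
  const answer = await answerQuestion(req.body.question); // sketched later in this post
  res.json({ answer });
});

app.listen(process.env.PORT || 5000);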

🧠 What I Learned

This project was a deep dive into several modern technologies and AI concepts:

  • ✅ Redis Stack with Vector Search for storing and retrieving document chunks based on similarity
  • ✅ Docker for containerizing Redis and RedisInsight
  • ✅ RAG architecture to improve the relevance of LLM responses
  • ✅ Connecting AI APIs (OpenAI) with real-world documents
  • ✅ Using pdf-parse and multer for document processing in Node.js

🧬 How It Works (Architecture)

  1. A user uploads a PDF, which is split into readable text chunks.
  2. Chunks are embedded using OpenAI’s Embedding API (text-embedding-ada-002).
  3. Embeddings are stored in Redis using a custom doc_idx vector index.
  4. When a user asks a question, it is also embedded.
  5. Vector similarity search retrieves the top-matching chunks.
  6. GPT-3.5 is prompted using those chunks as context to answer the question.

This is a classic RAG workflow: Retrieve → Augment → Generate.
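
Here is a minimal sketch of the ingestion side (steps 1–3), assuming the pdf-parse, openai, and node-redis packages. The chunk size, the doc: key prefix, and the text / embedding field names are illustrative choices rather than the repo's exact values; the 1536 dimension is fixed by text-embedding-ada-002.

const pdf = require('pdf-parse');
const OpenAI = require('openai');
const { createClient, SchemaFieldTypes, VectorAlgorithms } = require('redis');

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
const redis = createClient({ url: process.env.REDIS_URL });

// Create the doc_idx vector index once (cosine similarity over 1536-dim float vectors).
async function ensureIndex() {
  await redis.ft.create('doc_idx', {
    text: { type: SchemaFieldTypes.TEXT },
    embedding: {
      type: SchemaFieldTypes.VECTOR,
      ALGORITHM: VectorAlgorithms.FLAT,
      TYPE: 'FLOAT32',
      DIM: 1536,
      DISTANCE_METRIC: 'COSINE',
    },
  }, { ON: 'HASH', PREFIX: 'doc:' }).catch(() => {}); // ignore "index already exists"
}

async function ingestPdf(buffer) {
  if (!redis.isOpen) await redis.connect();
  await ensureIndex();

  // 1. Extract the text and split it into readable chunks (~1000 characters each here).
  const { text } = await pdf(buffer);
  const chunks = text.match(/[\s\S]{1,1000}/g) || [];

  // 2. Embed every chunk with text-embedding-ada-002.
  const { data } = await openai.embeddings.create({
    model: 'text-embedding-ada-002',
    input: chunks,
  });

  // 3. Store each chunk and its vector as a Redis hash under the indexed doc: prefix.
  for (let i = 0; i < chunks.length; i++) {
    await redis.hSet(`doc:${i}`, {
      text: chunks[i],
      embedding: Buffer.from(new Float32Array(data[i].embedding).buffer),
    });
  }
}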

🧪 Example Use Cases

You can upload any PDF document and ask questions such as:

  • “What are the key findings of this research?”
  • “Who is the recipient in this contract?”
  • “What’s the total amount in this invoice?”
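
Answering any of these questions follows steps 4–6 above. Here is a matching sketch of the query side, reusing the redis and openai clients from the ingestion sketch; the KNN count, field names, and prompt wording are illustrative.

async function answerQuestion(question) {
  if (!redis.isOpen) await redis.connect();

  // 4. Embed the question with the same model used for the document chunks.
  const { data } = await openai.embeddings.create({
    model: 'text-embedding-ada-002',
    input: question,
  });
  const queryVector = Buffer.from(new Float32Array(data[0].embedding).buffer);

  // 5. Vector similarity (KNN) search in doc_idx for the 3 closest chunks.
  const results = await redis.ft.search(
    'doc_idx',
    '*=>[KNN 3 @embedding $vec AS score]',
    {
      PARAMS: { vec: queryVector },
      SORTBY: 'score',
      DIALECT: 2,
      RETURN: ['text', 'score'],
    }
  );
  const context = results.documents.map((d) => d.value.text).join('\n---\n');

  // 6. Prompt GPT-3.5-Turbo with the retrieved chunks as context.
  const completion = await openai.chat.completions.create({
    model: 'gpt-3.5-turbo',
    messages: [
      { role: 'system', content: 'Answer using only the provided document context.' },
      { role: 'user', content: `Context:\n${context}\n\nQuestion: ${question}` },
    ],
  });
  return completion.choices[0].message.content;
}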

🧠 Redis Vector Index (Manual Option)

If you prefer to set up the vector index by hand instead of creating it from code, start Redis Stack first (it bundles the vector search module and RedisInsight):

docker run -d --name redis-stack -p 6379:6379 -p 8001:8001 redis/redis-stack

RedisInsight is available at: http://localhost:8001
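
With Redis Stack running, the doc_idx index itself can be created manually in redis-cli or the RedisInsight workbench. The command below is a sketch: the doc: key prefix and the text / embedding field names are assumptions that must match whatever your code writes, while DIM 1536 is the output size of text-embedding-ada-002.

FT.CREATE doc_idx ON HASH PREFIX 1 doc: SCHEMA text TEXT embedding VECTOR FLAT 6 TYPE FLOAT32 DIM 1536 DISTANCE_METRIC COSINE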

🔮 Future Improvements

Here's what I plan to build next:

  • 🧾 OCR Support: Use tesseract.js to support scanned PDFs (see the sketch after this list).
  • 📄 Multi-format Support: Accept .docx, .csv, .txt files.
  • 🔐 User Authentication: Secure access and user-based storage.
  • 🗃 Document History: Save uploaded files, questions, and answers.
  • 🧠 LangChain Integration: For better orchestration and multi-step reasoning.
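
For the OCR item, a rough sketch with tesseract.js could look like the snippet below. It assumes each scanned page has already been rendered to an image buffer (tesseract.js reads images, not PDFs directly), so a PDF-to-image step would be needed first.

const Tesseract = require('tesseract.js');

// Run OCR on a single page image and return plain text,
// ready to be chunked and embedded like a normal PDF.
async function ocrPageImage(imageBuffer) {
  const { data } = await Tesseract.recognize(imageBuffer, 'eng');
  return data.text;
}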

🔐 Quick Security Tips

  • Validate file types (no .exe, etc.).
  • Add rate limiting to the question API.
  • Enforce file size and token limits.
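
The first two tips can be wired up with multer options and the express-rate-limit package; the limits and route name below are illustrative, not hard recommendations.

const multer = require('multer');
const rateLimit = require('express-rate-limit');

// Only accept PDFs and cap uploads at 10 MB.
const upload = multer({
  storage: multer.memoryStorage(),
  limits: { fileSize: 10 * 1024 * 1024 },
  fileFilter: (req, file, cb) =>
    file.mimetype === 'application/pdf'
      ? cb(null, true)
      : cb(new Error('Only PDF uploads are allowed')),
});

// Allow at most 20 questions per minute per IP on the question endpoint.
app.use('/ask', rateLimit({ windowMs: 60 * 1000, max: 20 }));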

📦 Quickstart (Backend)

git clone <your-repo>
cd backend
npm install
touch .env

.env file:

OPENAI_API_KEY=your_openai_key
REDIS_URL=redis://localhost:6379
PORT=5000

Start Redis:

docker run -d --name redis-stack -p 6379:6379 -p 8001:8001 redis/redis-stack

Run the backend:

npm start

🧠 Final Thoughts

Building this Document Q&A Assistant gave me real hands-on experience with modern AI workflows, vector databases, and LLM integration. It’s a strong foundation for many real-world applications like:

  • Legal document search
  • Financial report summarization
  • Academic paper exploration

This is just the beginning. With OCR, better file type support, and LangChain, the possibilities are endless.

The code repos can be found below on my GitHub:

Frontend Repo
Backend Repo

👋 About Me
I’m a frontend developer working with Angular, React, and AI-assisted tools. I love writing clean, scalable code and sharing what I learn along the way.

Let’s connect in the comments — and don’t forget to ❤️ the post if you found it useful!
