Customer support questions span structured data (orders, products), unstructured knowledge (docs/FAQs), and live systems (shipping updates). In this post we'll ship a compact AI agent that handles all three, using:
- Python + smolagents to orchestrate the agent's "brain"
- InterSystems IRIS for SQL, Vector Search (RAG), and Interoperability (a mock shipping status API)
TL;DR (snack-sized)
- Build a working AI Customer Support Agent with Python + smolagents orchestrating tools on InterSystems IRIS (SQL, Vector Search/RAG, Interoperability for a mock shipping API).
- It answers real questions (e.g., "Was order #1001 delivered?", "What's the return window?") by combining tables, documents, and interoperability calls.
- Youβll spin up IRIS in Docker, load schema and sample data, embed docs for RAG, register tools (SQL/RAG/API), and run the agent via CLI or Gradio UI.
What you'll build
An AI Customer Support Agent that can:
- Query structured data (customers, orders, products, shipments) via SQL
- Retrieve unstructured knowledge (FAQs & docs) via RAG on IRIS Vector Search
- Call a (mock) shipping API via IRIS Interoperability, with Visual Trace to inspect every call
Architecture (at a glance)
User → Agent (smolagents CodeAgent)
  ├─ SQL Tool → IRIS tables
  ├─ RAG Tool → IRIS Vector Search (embeddings + chunks)
  └─ Shipping Tool → IRIS Interoperability (mock shipping) → Visual Trace
New to smolagents? It's a tiny agent framework from Hugging Face in which the model plans and uses your tools; alternatives include LangGraph and LlamaIndex.
Prerequisites
- Python 3.9+
- Docker to run IRIS in a container
- VS Code (or any editor) handy to check out the code
- An OpenAI API key for the LLM + embeddings, or run locally with Ollama if you prefer
1) Clone & set up Python
git clone https://github.com/intersystems-ib/customer-support-agent-demo
cd customer-support-agent-demo
python -m venv .venv
# macOS/Linux
source .venv/bin/activate
# Windows (PowerShell)
# .venv\Scripts\Activate.ps1
pip install -r requirements.txt
cp .env.example .env # add your OpenAI key
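The .env file holds your credentials. The exact variable names come from .env.example in the repo; as a sketch, it will look roughly like this (OPENAI_API_KEY is the usual convention, and the IRIS settings shown are assumptions, not confirmed names):

```shell
# OpenAI credentials for the LLM and embeddings
OPENAI_API_KEY=sk-...

# IRIS connection settings (illustrative names; check .env.example for the real ones)
IRIS_HOST=localhost
IRIS_PORT=1972
IRIS_NAMESPACE=USER
```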
2) Start InterSystems IRIS (Docker)
docker compose build
docker compose up -d
Open the Management Portal (http://localhost:52773 in this demo).
3) Load the structured data (SQL)
From SQL Explorer (Portal) or your favorite SQL client:
LOAD SQL FROM FILE '/app/iris/sql/schema.sql' DIALECT 'IRIS' DELIMITER ';';
LOAD SQL FROM FILE '/app/iris/sql/load_data.sql' DIALECT 'IRIS' DELIMITER ';';
The schema you have just loaded defines the Agent_Data tables you'll query below: Customers, Orders, Products, Shipments, and DocChunks (document chunks used later for RAG).
Run some queries and get familiar with the data. The agent will use this data to resolve questions:
-- List customers
SELECT * FROM Agent_Data.Customers;
-- Orders for a given customer
SELECT o.OrderID, o.OrderDate, o.Status, p.Name AS Product
FROM Agent_Data.Orders o
JOIN Agent_Data.Products p ON o.ProductID = p.ProductID
WHERE o.CustomerID = 1;
-- Shipment info for an order
SELECT * FROM Agent_Data.Shipments WHERE OrderID = 1001;
If you see rows, your structured side is ready.
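The agent's SQL tool (introduced in step 6) will run queries like these programmatically. The key practice is parameterized queries, so user-supplied values are bound rather than string-formatted into SQL. A minimal sketch of the pattern, using Python's built-in sqlite3 as a stand-in for an IRIS DB-API connection (the helper and in-memory table are illustrative, not the demo's actual code):

```python
import sqlite3

def orders_for_customer(conn, customer_id):
    """Fetch orders for one customer using a parameterized query."""
    cur = conn.execute(
        "SELECT OrderID, Status FROM Orders WHERE CustomerID = ?",
        (customer_id,),  # values are bound, never interpolated into the SQL text
    )
    return cur.fetchall()

# In-memory demo database standing in for IRIS
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Orders (OrderID INT, CustomerID INT, Status TEXT)")
conn.execute("INSERT INTO Orders VALUES (1001, 1, 'Delivered')")
print(orders_for_customer(conn, 1))
```

The same shape works against IRIS with its Python DB-API driver: only the connection object changes.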
4) Add unstructured knowledge with Vector Search (RAG)
Create an embedding config (the example below uses an OpenAI embedding model; tweak to taste):
INSERT INTO %Embedding.Config
(Name, Configuration, EmbeddingClass, VectorLength, Description)
VALUES
('my-openai-config',
'{"apiKey":"YOUR_OPENAI_KEY","sslConfig":"llm_ssl","modelName":"text-embedding-3-small"}',
'%Embedding.OpenAI',
1536,
'a small embedding model provided by OpenAI');
Need the exact steps and options? Check the documentation.
Then embed the sample content:
python scripts/embed_sql.py
Check that the embeddings are now in the tables:
SELECT COUNT(*) AS ProductChunks FROM Agent_Data.Products;
SELECT COUNT(*) AS DocChunks FROM Agent_Data.DocChunks;
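Before embedding, documents are typically split into overlapping chunks so each vector covers a focused span of text, which is what populates a table like DocChunks. A minimal chunking sketch, illustrating the idea rather than the actual embed_sql.py code:

```python
def chunk_text(text, chunk_size=400, overlap=50):
    """Split text into overlapping character windows, one per embedding."""
    if chunk_size <= overlap:
        raise ValueError("chunk_size must exceed overlap")
    chunks = []
    step = chunk_size - overlap  # consecutive chunks share `overlap` characters
    for start in range(0, len(text), step):
        chunks.append(text[start:start + chunk_size])
        if start + chunk_size >= len(text):
            break
    return chunks

doc = "Our warranty covers manufacturing defects for 24 months. " * 20
chunks = chunk_text(doc)
print(len(chunks), "chunks; first chunk length:", len(chunks[0]))
```

The overlap keeps a sentence that straddles a boundary retrievable from at least one chunk.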
Bonus: hybrid + vector search directly from SQL with EMBEDDING()
A major advantage of IRIS is that you can perform semantic (vector) search right inside SQL and mix it with classic filters, with no extra microservices needed. The EMBEDDING() SQL function generates a vector on the fly for your query text, which you can compare against stored vectors using operations like VECTOR_DOT_PRODUCT.
Example A: hybrid product search (price filter + semantic ranking):
SELECT TOP 3
p.ProductID,
p.Name,
p.Category,
p.Price,
VECTOR_DOT_PRODUCT(p.Embedding, EMBEDDING('headphones with ANC', 'my-openai-config')) score
FROM Agent_Data.Products p
WHERE p.Price < 200
ORDER BY score DESC
Example B: semantic doc-chunk lookup (great for feeding RAG answers):
SELECT TOP 3
c.ChunkID AS chunk_id,
c.DocID AS doc_id,
c.Title AS title,
SUBSTRING(c.ChunkText, 1, 400) AS snippet,
VECTOR_DOT_PRODUCT(c.Embedding, EMBEDDING('warranty coverage', 'my-openai-config')) AS score
FROM Agent_Data.DocChunks c
ORDER BY score DESC
Why this is powerful: you can pre-filter by price, category, language, tenant, dates, etc., and then rank by semantic similarity, all in one SQL statement.
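To make the filter-then-rank idea concrete, here is the same logic as Example A expressed in plain Python: keep rows that pass a predicate, then order them by dot-product similarity. The toy 3-dimensional vectors stand in for the 1536-dimensional embeddings, and the product data is made up for illustration:

```python
def dot(a, b):
    """Dot product, the similarity measure VECTOR_DOT_PRODUCT computes in SQL."""
    return sum(x * y for x, y in zip(a, b))

products = [
    {"name": "NoiseFree ANC Headphones", "price": 180, "vec": [0.9, 0.1, 0.0]},
    {"name": "Studio Headphones Pro",    "price": 320, "vec": [0.8, 0.2, 0.1]},
    {"name": "Travel Pillow",            "price": 25,  "vec": [0.1, 0.9, 0.2]},
]
query_vec = [1.0, 0.0, 0.0]  # stand-in for EMBEDDING('headphones with ANC', ...)

# WHERE p.Price < 200 ... ORDER BY score DESC, in Python
hits = sorted(
    (p for p in products if p["price"] < 200),  # pre-filter
    key=lambda p: dot(p["vec"], query_vec),     # semantic ranking
    reverse=True,
)
print([p["name"] for p in hits])
```

In IRIS the filtering and ranking happen inside the database engine, so you never ship every row to the application just to score it.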
5) Wire up a live (mock) shipping API with Interoperability
The project exposes a tiny /api/shipping/status endpoint through IRIS Interoperability, perfect for simulating "real world" calls:
curl -H "Content-Type: application/json" \
-X POST \
-d '{"orderStatus":"Processing","trackingNumber":"DHL7788"}' \
http://localhost:52773/api/shipping/status
Now open Visual Trace in the Portal to watch the message flow hop by hop (it's like airport radar for your integration).
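On the agent side, the shipping tool ultimately has to turn the endpoint's JSON into text the LLM can use. A small sketch of that post-processing, assuming the reply carries the same orderStatus/trackingNumber fields shown in the curl example (the helper name is made up):

```python
import json

def summarize_shipping(payload: str) -> str:
    """Turn the shipping API's JSON reply into a one-line summary for the agent."""
    data = json.loads(payload)
    return (f"Order status: {data['orderStatus']} "
            f"(tracking {data['trackingNumber']})")

reply = '{"orderStatus": "Processing", "trackingNumber": "DHL7788"}'
print(summarize_shipping(reply))
```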
6) Meet the agent (smolagents + tools)
Peek at these files:
- agent/customer_support_agent.py: boots a CodeAgent and registers tools
- agent/tools/sql_tool.py: parameterized SQL helpers
- agent/tools/rag_tool.py: vector search + doc retrieval
- agent/tools/shipping_tool.py: calls the Interoperability endpoint
The CodeAgent plans with short code steps and calls your tools. You bring the tools; it brings the brains, using an LLM.
7) Run it!
One-shot (quick tests)
python -m cli.run --email alice@example.com --message "Where is my order #1001?"
python -m cli.run --email alice@example.com --message "Show electronics that are good for travel"
python -m cli.run --email alice@example.com --message "Was my headphones order delivered, and what's the return window?"
Interactive CLI
python -m cli.run --email alice@example.com
Web UI (Gradio)
python -m ui.gradio
# open http://localhost:7860
Under the hood
The agentβs flow (simplified):
Plan: decide how to resolve the question and which available tools to use, e.g., "check order status → fetch returns policy".
Call tools as needed:
- SQL for customers/orders/products
- RAG over embeddings for FAQs/docs (and remember, you can prototype RAG right inside SQL using EMBEDDING() + vector ops, as shown above)
- Interoperability API for shipping status
Synthesize: stitch the results into a friendly, precise answer.
Add or swap tools as your use case grows: promotions, warranties, inventory, you name it.
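Stripped of the framework, that plan → call tools → synthesize loop can be sketched in a few lines. This is a conceptual toy, not the smolagents internals: a real CodeAgent lets the LLM write the plan as code, whereas here the "planner" is just keyword matching over stub tools:

```python
def run_agent(question, tools):
    """Toy dispatcher: pick tools by keyword, run them, stitch the results."""
    plan = []  # in smolagents, the LLM produces this plan
    if "order" in question.lower():
        plan.append("sql")
    if "return" in question.lower() or "policy" in question.lower():
        plan.append("rag")
    results = [tools[name](question) for name in plan]  # call tools
    return " ".join(results)                            # synthesize

# Stub tools returning canned answers, standing in for the real SQL/RAG tools
tools = {
    "sql": lambda q: "Order #1001 was delivered.",
    "rag": lambda q: "Returns are accepted within 30 days.",
}
print(run_agent("Was my order delivered, and what's the return policy?", tools))
```

Swapping a stub for a real tool (or adding a shipping one) changes nothing in the loop, which is exactly why tool-based agents extend so easily.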
Wrap-up
You now have a compact AI Customer Support Agent that blends:
- LLM reasoning (smolagents CodeAgent)
- Structured data (IRIS SQL)
- Unstructured knowledge (IRIS Vector Search + RAG), with the bonus that EMBEDDING() lets you do hybrid + vector search directly from SQL
- Live system calls (IRIS Interoperability + Visual Trace)

