Muhammad Arslan


Building a Production-Ready AI Customer Service Agent in NodeJS

How we evolved the Agent CSR example into an HCEL-first architecture, while preserving reliability with runtime fallback.

If you like it, do not forget to star the HazelJS repository.


Why we updated this example

The old CSR example already had strong agent tooling, RAG, and streaming.

The new version focuses on three related improvements:

  • HCEL-first orchestration in the service layer (@hazeljs/ai)
  • AgentRuntime fallback for compatibility and safer migration
  • Cleaner service/controller/gateway internals without changing API contracts

This means you get simpler code paths for new projects, but still keep proven runtime behavior for edge cases.


The use case: AI customer support that can act

Support agents in production need more than text generation:

  • Knowledge grounding from docs/policies (RAG)
  • Action execution through typed tools (orders, inventory, refunds, tickets)
  • Safety controls for risky actions (human approval)
  • Real-time UX (SSE + WebSocket streaming)

The CSR example is designed to show all of these in one runnable app.


What changed in architecture

1) HCEL-first service execution

CSRService.chat() and CSRService.chatStream() now run through:

await ai.hazel
  .context({ sessionId, userId })
  .prompt(`Customer request: ${message}`)
  .agent('csr-agent')
  .execute();

If this HCEL path throws, the service falls back to:

  • runtime.execute(...) for sync calls
  • runtime.executeStream(...) for streaming calls

This gives a smooth migration path from runtime-first designs to composable HCEL flows.
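The fallback behavior can be sketched as a small wrapper. Note that `HazelChain`, `AgentRuntime`, and the result shapes below are simplified assumptions for illustration, not the actual @hazeljs/ai types:

```typescript
// Minimal sketch of the HCEL-first / runtime-fallback pattern.
// HazelChain and AgentRuntime are simplified stand-ins here,
// not the real @hazeljs/ai interfaces.
interface ChatInput { sessionId: string; userId: string; message: string }
interface ChatResult { text: string; via: 'hcel' | 'runtime' }

interface HazelChain {
  context(ctx: Record<string, string>): HazelChain;
  prompt(p: string): HazelChain;
  agent(name: string): HazelChain;
  execute(): Promise<ChatResult>;
}

interface AgentRuntime {
  execute(agent: string, input: ChatInput): Promise<ChatResult>;
}

async function chatWithFallback(
  hazel: HazelChain,
  runtime: AgentRuntime,
  input: ChatInput,
): Promise<ChatResult> {
  try {
    // HCEL-first: one composable chain handles the whole request
    return await hazel
      .context({ sessionId: input.sessionId, userId: input.userId })
      .prompt(`Customer request: ${input.message}`)
      .agent('csr-agent')
      .execute();
  } catch {
    // Any HCEL failure falls back to the proven runtime path
    return runtime.execute('csr-agent', input);
  }
}
```

The streaming variant follows the same shape, with `executeStream(...)` on the runtime side.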

2) Agent registration safety

During hot reload, agent registration could fail with "already registered".

Now registration is duplicate-aware so dev bootstrap is stable.
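A duplicate-aware guard can look like the sketch below; the registry shape is an illustrative assumption, not the framework's internals:

```typescript
// Hypothetical duplicate-aware agent registry. During hot reload,
// register() may run twice for the same agent name; a repeat
// registration becomes a no-op instead of throwing.
class AgentRegistry {
  private agents = new Map<string, unknown>();

  register(name: string, agent: unknown): void {
    if (this.agents.has(name)) return; // already registered: skip quietly
    this.agents.set(name, agent);
  }

  has(name: string): boolean {
    return this.agents.has(name);
  }
}
```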

3) Simpler internal wiring

  • Response mapping is centralized in one helper
  • Queue initialization is isolated
  • Repeated stream/error-formatting code was removed from the controller and gateway

External behavior remains the same.
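Centralized response mapping might look like the sketch below; the field names (`text`, `output`, `reply`) are assumptions standing in for whatever the service actually returns:

```typescript
// Hypothetical response mapper: both the HCEL path and the runtime
// fallback funnel through one helper, so the API shape never varies.
// Field names here are illustrative assumptions.
interface AgentResult { text?: string; output?: string; sessionId: string }
interface ApiResponse { reply: string; sessionId: string }

function toApiResponse(result: AgentResult): ApiResponse {
  return {
    // Accept either result shape so both execution paths map identically
    reply: result.text ?? result.output ?? '',
    sessionId: result.sessionId,
  };
}
```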


What the CSR agent can do

The CSRAgent still uses @Agent + @Tool with memory and RAG enabled.

Tool                   Purpose                             Approval
lookupOrder            Get order status/details            No
checkInventory         Check stock/restock info            No
processRefund          Process refund with amount/reason   Yes
updateShippingAddress  Update shipping address             Yes
createTicket           Create support escalation ticket    No
searchKnowledgeBase    Query ingested docs/policies        No

Sensitive operations (processRefund, updateShippingAddress) require approval through POST /api/csr/approve.
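The approval rules from the table above can be expressed as plain data; the real example wires this up with @Agent/@Tool decorators, whose exact signatures aren't reproduced here:

```typescript
// Tool/approval matrix from the CSR example, expressed as plain data.
// The actual agent declares these via @Tool decorators.
interface ToolSpec { name: string; requiresApproval: boolean }

const csrTools: ToolSpec[] = [
  { name: 'lookupOrder', requiresApproval: false },
  { name: 'checkInventory', requiresApproval: false },
  { name: 'processRefund', requiresApproval: true },
  { name: 'updateShippingAddress', requiresApproval: true },
  { name: 'createTicket', requiresApproval: false },
  { name: 'searchKnowledgeBase', requiresApproval: false },
];

// Gate sensitive tool calls behind the human-approval flow
function needsApproval(toolName: string): boolean {
  return csrTools.some((t) => t.name === toolName && t.requiresApproval);
}
```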


Why HCEL + @hazeljs/ai is the key upgrade

The practical benefits of HCEL here:

  • One composable chain for orchestration
  • Clear execution order that is easier to debug
  • Incremental complexity (start with prompt, add context, then agent/tools)
  • Less scattered AI logic across files

The practical benefits of @hazeljs/ai here:

  • Unified platform object (HazelAI) for provider + orchestration + RAG integration
  • Shared behavior between sync and streaming paths
  • Easier future expansion (.ml(), .observe(), .cache(), .persist()) without major rewrites

API surface (unchanged)

  • POST /api/csr/chat
  • POST /api/csr/chat/stream
  • POST /api/csr/ingest
  • POST /api/csr/approve
  • GET /api/csr/health
  • WebSocket: ws://localhost:3001/csr

HazelJS Agent CSR Example

Full-fledged Agent CSR (Customer Service Representative) example using @hazeljs packages: Agent, AI, RAG, Memory, Queue, WebSocket.

Features

  • AI Agent - Stateful CSR agent with tools (order lookup, inventory, refunds, tickets, knowledge search)
  • RAG - Retrieval-augmented generation for FAQ and documentation
  • Memory - Conversation memory with BufferMemory (dev) / HybridMemory (prod)
  • Approval Workflow - Human-in-the-loop for refunds and address updates
  • REST API - POST /api/csr/chat, /api/csr/chat/stream, /api/csr/ingest, /api/csr/approve
  • WebSocket - Real-time chat at ws://localhost:3001/csr
  • Queue - Optional async ticket creation (Redis/BullMQ)
  • Production - Rate limiting, circuit breaker, retry, health checks
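The retry behavior listed above can be sketched as a generic helper; this is a standalone illustration, not the framework's built-in retry:

```typescript
// Generic retry helper: re-invoke an async operation up to `attempts`
// times, rethrowing the last error if every attempt fails.
async function withRetry<T>(fn: () => Promise<T>, attempts = 3): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err; // remember the failure and try again
    }
  }
  throw lastError;
}
```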

Quick Start

# Install dependencies
npm install

# Set OpenAI API key (required)
export OPENAI_API_KEY=your-key

# Run
npm run dev

HCEL (Hazel Composable Expression Language)

This example is designed to be read alongside the docs guide and the code in src/csr/.

Important: the runnable CSR server in src/csr/ now executes chat through an HCEL-first path…



Quick start (updated)

cd hazeljs-csr-example
npm install
export OPENAI_API_KEY="your-key"
npm run dev

Then test:

  1. Health: GET /api/csr/health
  2. Ingest: POST /api/csr/ingest
  3. Chat: POST /api/csr/chat
  4. Stream: POST /api/csr/chat/stream
  5. Approve sensitive tool calls: POST /api/csr/approve

RAG and storage options

The example auto-selects a vector store based on environment variables:

  • PINECONE_API_KEY -> Pinecone
  • else QDRANT_URL -> Qdrant
  • else -> in-memory

This keeps local setup fast while preserving a production path.
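The selection order above boils down to a small function; the store names here are placeholders, not the real @hazeljs RAG factories:

```typescript
// Env-driven vector store selection, mirroring the order described above:
// Pinecone first, then Qdrant, then an in-memory fallback for local dev.
type StoreKind = 'pinecone' | 'qdrant' | 'memory';

function selectVectorStore(env: Record<string, string | undefined>): StoreKind {
  if (env.PINECONE_API_KEY) return 'pinecone';
  if (env.QDRANT_URL) return 'qdrant';
  return 'memory';
}
```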


What this demonstrates about HazelJS

HazelJS provides AI-native backend primitives as first-class modules.

Instead of stitching unrelated libraries together manually, you compose modules with a consistent framework model (DI, decorators, modules, controllers).


Final takeaway

The updated CSR example shows a practical migration pattern many teams need:

  • adopt HCEL-first for cleaner, composable orchestration
  • keep runtime fallback for reliability and backward compatibility
  • preserve existing API contracts while reducing code complexity

If you are building AI-native support backends, this is a solid foundation to start from.

If you like it, do not forget to star HazelJS on GitHub.
