Building an AI-Powered E-Commerce Platform with Rich UI Rendering

Aniket Hingane

TL;DR

I built an experimental e-commerce platform where the AI assistant doesn't just chat—it renders interactive UI widgets directly in the conversation. When you ask "add Smart Watch to my cart," you get a product card with live pricing. Request "show me a bar chart of my cart," and you see an actual SVG chart rendered in real time. This PoC pairs CopilotKit with Azure OpenAI (not regular OpenAI) and implements rich rendering: product cards, shopping cart widgets, bar charts, and pie charts, all displayed natively within the chat interface.


Introduction

What's This Article About?

When I first started experimenting with AI-powered user interfaces, I thought the possibilities were limited to text-based responses. Then I discovered CopilotKit's rich rendering capabilities, and everything changed. In my opinion, the future of AI interactions isn't just about getting text answers—it's about receiving interactive, visual components that you can actually use.

This article documents my journey building an experimental AI shopping assistant that broke all my assumptions about what chat interfaces could do. I'm not talking about a production system here; this is purely a proof-of-concept where I pushed the boundaries of what I thought was possible.

From my experience with this experiment, I learned that modern AI frameworks like CopilotKit can render complete React components in chat responses. When a user asks for cart analytics, the AI doesn't just return "You have 3 items totaling $299.97"—it renders an interactive dashboard with bar charts, category breakdowns, and real-time calculations.
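
To make that concrete, here's a minimal sketch of the kind of render-capable action I mean. It's not the exact code from my project (the parameters and the SVG markup are simplified placeholders), but it shows the core idea: the AI calls an action, and the render function puts a real React element into the chat transcript.

"use client";

import { useCopilotAction } from "@copilotkit/react-core";

// Simplified sketch: when the AI invokes showBarChart, the chat displays
// an inline SVG bar chart built from the values the model passes in.
export function useBarChartAction() {
  useCopilotAction({
    name: "showBarChart",
    description: "Render a simple bar chart of numeric values in the chat.",
    parameters: [
      { name: "values", type: "number[]", description: "Values to chart", required: true },
    ],
    handler: async () => "Chart rendered",
    render: ({ args }) => {
      const values = args.values ?? [];
      const max = Math.max(...values, 1);
      return (
        <svg width={values.length * 30} height={80}>
          {values.map((v, i) => (
            <rect
              key={i}
              x={i * 30}
              y={80 - (v / max) * 80}
              width={24}
              height={(v / max) * 80}
              fill="#3b82f6"
            />
          ))}
        </svg>
      );
    },
  });
}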

Tech Stack

Here's what I assembled for this experimental platform:

  • Frontend: Next.js 15.0.0 (App Router)
  • AI Framework: CopilotKit 1.10.6
  • AI Backend: Azure OpenAI (GPT-4.1-mini deployment)
  • Styling: Tailwind CSS 3.4.17
  • Language: TypeScript
  • State Management: React hooks (useState, useRef, useEffect)
  • Charts: Custom SVG implementations

In my experiments, I chose this stack because I wanted something modern that could handle both server-side rendering and rich client interactions. As per my understanding, CopilotKit works beautifully with Next.js's App Router pattern.
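
Before going further, here's roughly how these pieces plug together under the App Router. This is a simplified sketch; the file placement, labels, and class names are illustrative rather than my exact code:

// src/app/layout.tsx (sketch)
import { CopilotKit } from "@copilotkit/react-core";
import { CopilotSidebar } from "@copilotkit/react-ui";
import "@copilotkit/react-ui/styles.css";

export default function RootLayout({ children }: { children: React.ReactNode }) {
  return (
    <html lang="en">
      <body>
        {/* Point the frontend at the Azure OpenAI route shown later in this article */}
        <CopilotKit runtimeUrl="/api/copilotkit">
          <CopilotSidebar labels={{ title: "Shopping Assistant" }}>
            {children}
          </CopilotSidebar>
        </CopilotKit>
      </body>
    </html>
  );
}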

Why Read It?

From my perspective, this article offers value in several ways:

1. Azure OpenAI Integration: Most tutorials show regular OpenAI setup. I put together the Azure OpenAI configuration because, in my experience, many enterprises use Azure. I'll show you exactly how I configured it.

2. Rich UI Rendering: I thought chat responses had to be text. I was wrong. I'll demonstrate how I implemented product cards, charts, and interactive widgets that render directly in chat conversations.

3. State Management Challenges: I encountered fascinating problems with React closures and state updates. I think sharing these challenges will save you hours of debugging.

4. Practical E-Commerce Use Case: This isn't a todo list demo. I built a real shopping cart with add/remove functionality, analytics, and visual data representations.

5. Complete Source Code: Everything I experimented with is documented here with actual working code from my project.


Let's Design

The Architecture I Envisioned

When I sat down to design this system, I thought about what would make a truly intelligent shopping assistant. In my opinion, the magic happens when three layers work seamlessly together:

The Three-Layer Architecture:

  1. Presentation Layer - Next.js frontend with product grid, cart display, and CopilotSidebar
  2. Actions Layer - CopilotKit actions (addToCart, showBarChart, showPieChart, etc.)
  3. AI Layer - Azure OpenAI API route with CopilotRuntime

From my experimentation, I found this layered approach crucial. Each layer has a clear responsibility, and I think this separation made debugging much easier.

The Data Flow I Implemented

I designed the data flow to handle both actions (user commands) and rendering (visual responses). Here's how I put it together:

  1. User Input: "Add Smart Watch to my cart"
  2. AI Processing: Azure OpenAI understands intent
  3. Action Execution: useCopilotAction handler runs
  4. State Update: React state changes with new cart item
  5. UI Rendering: Custom render function creates visual card
  6. Display: Interactive widget appears in chat

In my opinion, the render function is where the magic happens. I thought it would be complex, but CopilotKit's API made it surprisingly straightforward.
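
Here's a condensed sketch of what that flow looks like for addToCart, ahead of the full walkthrough below. The product lookup, state setter, and card markup are simplified placeholders for the real components:

"use client";

import { useCopilotAction } from "@copilotkit/react-core";

type CartItem = { name: string; price: number };

// Sketch of the actions layer: the handler updates React state, and the
// render function draws a product card inside the chat conversation.
export function useAddToCartAction(
  products: CartItem[],
  addItem: (item: CartItem) => void
) {
  useCopilotAction({
    name: "addToCart",
    description: "Add a product to the shopping cart by name.",
    parameters: [
      { name: "productName", type: "string", description: "Name of the product to add", required: true },
    ],
    handler: async ({ productName }) => {
      const product = products.find(
        (p) => p.name.toLowerCase() === productName.toLowerCase()
      );
      if (product) addItem(product);
      return product ? `Added ${product.name}` : `Couldn't find ${productName}`;
    },
    render: ({ status, args }) => (
      <div className="rounded-lg border p-3">
        <p className="font-semibold">{args.productName ?? "Product"}</p>
        <p>{status === "complete" ? "Added to cart ✓" : "Adding..."}</p>
      </div>
    ),
  });
}

One note from my debugging sessions: because the handler closes over products and addItem, stale values captured at mount time cause exactly the kind of closure problems I mentioned earlier, so passing fresh props or using functional state updates matters here.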


Let's Get Cooking

Part 1: Setting Up Azure OpenAI Integration

The first thing I tackled was connecting CopilotKit to Azure OpenAI instead of regular OpenAI. In my experience, this required creating a custom API route. Here's exactly how I did it.

File: src/app/api/copilotkit/route.ts

import {
  CopilotRuntime,
  OpenAIAdapter,
  copilotRuntimeNextJSAppRouterEndpoint,
} from "@copilotkit/runtime";
import { AzureOpenAI } from "openai";
import { NextRequest } from "next/server";

export const POST = async (req: NextRequest) => {
  // I grabbed these from environment variables
  const apiKey = process.env.AZURE_OPENAI_API_KEY;
  const endpoint = process.env.AZURE_OPENAI_ENDPOINT;
  const deploymentName = process.env.AZURE_OPENAI_DEPLOYMENT_NAME;
  const apiVersion = process.env.AZURE_OPENAI_API_VERSION || "2024-02-15-preview";

  // I added validation to catch configuration errors early
  if (!apiKey || !endpoint || !deploymentName) {
    return new Response(
      JSON.stringify({ 
        error: "Azure OpenAI not configured" 
      }), 
      { status: 500, headers: { "Content-Type": "application/json" }}
    );
  }

  // This is where I created the Azure OpenAI client
  const openai = new AzureOpenAI({
    apiKey,
    endpoint,
    deployment: deploymentName,
    apiVersion,
  });

  // I configured CopilotKit to use Azure's adapter
  const { handleRequest } = copilotRuntimeNextJSAppRouterEndpoint({
    runtime: new CopilotRuntime(),
    serviceAdapter: new OpenAIAdapter({ 
      openai,
      model: deploymentName,
    }),
    endpoint: "/api/copilotkit",
  });

  return handleRequest(req);
};

Why I wrote it this way:

From my understanding, the key difference between Azure OpenAI and regular OpenAI is the initialization. I needed to pass the deployment parameter instead of just a model name. I think this trips up many developers, so I made sure to document it clearly.

The validation block I added saved me hours of debugging. In my experiments, misconfigured endpoints would fail silently. I put in explicit error messages so I'd know immediately what was wrong.
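
To see the initialization difference in isolation, here's a side-by-side sketch (resource names and environment variables are placeholders):

import OpenAI, { AzureOpenAI } from "openai";

// Regular OpenAI: you choose a model per request.
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

// Azure OpenAI: the client is bound to a resource endpoint, a deployment
// name (which stands in for the model), and an explicit API version.
const azure = new AzureOpenAI({
  apiKey: process.env.AZURE_OPENAI_API_KEY,
  endpoint: process.env.AZURE_OPENAI_ENDPOINT,
  deployment: process.env.AZURE_OPENAI_DEPLOYMENT_NAME,
  apiVersion: process.env.AZURE_OPENAI_API_VERSION,
});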

Environment Configuration:

I stored my Azure credentials in .env.local:

AZURE_OPENAI_API_KEY=your-azure-key-here
AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com/
AZURE_OPENAI_DEPLOYMENT_NAME=gpt-4.1-mini
AZURE_OPENAI_API_VERSION=2024-12-01-preview

This article documents a personal experimental project and proof-of-concept.
