Mehran Davoudi

AI+UX: How to Bridge UI and Talk Seamlessly

In the evolving landscape of conversational interfaces, the line between UI and Talk is blurring, and CrystaCode.ai is a prime example of what’s possible when AI meets thoughtful UX design.

CrystaCode is a conversational AI platform (based in Dubai!) that interacts with users through either text-based chat or live voice. But what makes it truly exciting isn't just the dual modality; it's the unified infrastructure powering both experiences.

🧠 One Brain, Two Mouths: Unified AI Infrastructure

At the heart of CrystaCode is a shared backend that serves both chat and talk interactions. This isn't just a convenience; it's a design philosophy. By using the same orchestration logic, prompt engineering, and AI pipelines, we ensure consistent behavior and reduce duplication across modalities.

To achieve this, we rely on two key technologies:

  • Model Context Protocol (MCP): A flexible orchestration layer that abstracts AI interactions into composable units.
  • Microsoft.Extensions.AI.Abstractions: This abstraction layer allows us to plug in different AI providers while maintaining a consistent interface for both chat and voice.

This architecture means that whether a user types a question or speaks it aloud, the same intelligent backend processes the request, applies context, and returns a response.
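To make the "one brain" idea concrete, here's a minimal sketch of what such a shared service can look like, built on the IChatClient abstraction from Microsoft.Extensions.AI. The class and method names are illustrative, not CrystaCode's actual code:

```csharp
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.AI;

// One conversation service shared by both modalities: the chat endpoint and the
// voice pipeline call the same method, so prompts and orchestration live in one place.
public class CrystaConversationService
{
    private readonly IChatClient _chatClient;

    public CrystaConversationService(IChatClient chatClient) => _chatClient = chatClient;

    public async Task<string> AskAsync(string userInput, IList<ChatMessage> history,
        CancellationToken ct = default)
    {
        history.Add(new ChatMessage(ChatRole.User, userInput));

        // The same call handles typed text and transcribed speech alike.
        ChatResponse response = await _chatClient.GetResponseAsync(history, cancellationToken: ct);

        history.Add(new ChatMessage(ChatRole.Assistant, response.Text));
        return response.Text;
    }
}
```

Because the voice pipeline calls the same method after speech-to-text, any change to prompts or orchestration automatically applies to both modalities.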

🔊 Real-Time Voice with SignalR

Voice interactions demand low latency and real-time streaming. That’s where SignalR comes in. We use it to stream audio between the client and server, enabling live conversations with minimal delay.

SignalR’s streaming model fits perfectly with our AI pipeline, allowing us to push intermediate responses, handle interruptions, and maintain a fluid user experience.
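As a rough illustration (the hub, pipeline, and method names here are hypothetical), a SignalR hub can accept a client-to-server audio stream and push partial responses back to the caller as they arrive:

```csharp
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR;

public class TalkHub : Hub
{
    private readonly IVoicePipeline _pipeline; // hypothetical speech + AI pipeline

    public TalkHub(IVoicePipeline pipeline) => _pipeline = pipeline;

    // Client-to-server streaming: the browser sends audio chunks as they are captured.
    public async Task StreamAudio(IAsyncEnumerable<byte[]> audioChunks)
    {
        await foreach (var reply in _pipeline.StreamRepliesAsync(audioChunks, Context.ConnectionAborted))
        {
            // Push each intermediate response back to the caller with minimal delay.
            await Clients.Caller.SendAsync("partialResponse", reply);
        }
    }
}

// Hypothetical abstraction over transcription plus the shared AI backend.
public interface IVoicePipeline
{
    IAsyncEnumerable<string> StreamRepliesAsync(IAsyncEnumerable<byte[]> audio, CancellationToken ct);
}
```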

🧩 Blazor for UI and Code Reusability

On the frontend, Blazor helps us maximize code reuse across the chat and voice interfaces. With Blazor's component model, we can build shared UI elements, manage state consistently, and even reuse validation and business logic across both modalities. On top of Blazor, we use Bit Platform, which we believe offers a state-of-the-art setup for getting the most out of Blazor in both functionality and performance.

This means faster development cycles and fewer bugs, plus a more cohesive user experience.
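As a small, hypothetical example of the kind of sharing this enables, a single state container can drive both the chat view and the talk view:

```csharp
using System;
using System.Collections.Generic;

// Illustrative shared state (names are assumptions): both the chat component and
// the talk component bind to the same transcript and re-render on the same event,
// so transcript logic is written once regardless of modality.
public record TranscriptEntry(string Role, string Text);

public class ConversationState
{
    private readonly List<TranscriptEntry> _entries = new();

    public IReadOnlyList<TranscriptEntry> Entries => _entries;

    public event Action? Changed;

    public void Append(string role, string text)
    {
        _entries.Add(new TranscriptEntry(role, text));
        Changed?.Invoke(); // subscribed components call StateHasChanged here
    }
}

// Registered once, e.g. builder.Services.AddScoped<ConversationState>(),
// and injected into both the chat and talk components.
```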

☁️ Azure OpenAI for Scale and Flexibility

To power the intelligence behind CrystaCode, we use Azure OpenAI. It gives us:

  • Access to cutting-edge language models
  • Enterprise-grade scalability
  • Fine-tuning and deployment flexibility

Combined with MCP, we can route requests to different models based on context, user profile, or even modality, without changing the frontend or orchestration logic.
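A simplified sketch of what such routing can look like (the names and the modality-based rule are assumptions for illustration):

```csharp
using System.Collections.Generic;
using Microsoft.Extensions.AI;

// Each modality maps to a pre-configured IChatClient -- for example, separate
// Azure OpenAI deployments -- so the frontend and orchestration logic never
// need to know which model is actually answering.
public enum Modality { Chat, Voice }

public class ModelRouter
{
    private readonly IReadOnlyDictionary<Modality, IChatClient> _clients;

    public ModelRouter(IReadOnlyDictionary<Modality, IChatClient> clients) => _clients = clients;

    public IChatClient Resolve(Modality modality)
    {
        // Voice favors low latency, so it might map to a faster deployment;
        // user profile or conversation context could refine the choice further.
        return _clients[modality];
    }
}
```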

✅ Semantic Integration Testing with skUnit

Building a conversational AI that feels natural across both chat and voice isn’t just about infrastructure; it’s about precision. That’s why we use skUnit, a testing framework purpose-built for semantic validation of AI interactions.

With skUnit, we create end-to-end integration tests that simulate real user conversations and validate the AI’s responses not just syntactically, but semantically. This means we can:

  • Assert that the AI understands context across turns
  • Validate that the tone, intent, and structure of responses match expectations
  • Catch regressions in prompt behavior or orchestration logic early

This level of testing is especially powerful when combined with MCP orchestrations, allowing us to simulate full conversational flows and verify outcomes across different model configurations and user scenarios.
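To show the flavor of a semantic assertion, here is a conceptual sketch. It illustrates the idea of LLM-judged checks rather than skUnit's actual API, and the service, judge wiring, and assertion text are all hypothetical:

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Extensions.AI;
using Xunit;

// Conceptual sketch only -- not skUnit's API. CrystaConversationService is the
// hypothetical shared service from the earlier sketch.
public class BookingFlowTests
{
    private readonly CrystaConversationService _service; // test-host wiring omitted
    private readonly IChatClient _judge;                  // a model used to grade responses

    public BookingFlowTests(CrystaConversationService service, IChatClient judge)
    {
        _service = service;
        _judge = judge;
    }

    [Fact]
    public async Task Assistant_keeps_context_across_turns()
    {
        var history = new List<ChatMessage>();
        await _service.AskAsync("I'd like to book a demo for next Tuesday.", history);
        var reply = await _service.AskAsync("Actually, make it Wednesday.", history);

        // Semantic check: ask a judge model whether the reply satisfies a condition,
        // instead of matching exact strings.
        var verdict = await _judge.GetResponseAsync(new[]
        {
            new ChatMessage(ChatRole.User,
                $"Answer YES or NO: does this reply confirm rescheduling to Wednesday?\n\n{reply}")
        });

        Assert.Contains("YES", verdict.Text.ToUpperInvariant());
    }
}
```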

By embedding skUnit into our CI pipeline, we’ve turned conversational quality into a measurable, repeatable discipline, raising the bar for what AI UX can deliver.

✨ The UX Payoff

This tech stack isn't just elegant; it's user-centric. By unifying the backend and optimizing the frontend, we deliver:

  • Seamless transitions between chat and voice
  • Faster response times
  • Consistent personality and tone across modalities
  • A scalable foundation for future features like multimodal input or emotion-aware responses

CrystaCode isn’t just an app; it’s a blueprint for how AI and UX can co-evolve.
