Praneeth Kawya Thathsara

Copy This Rive Setup for Your AI App (Step-by-Step)

Most AI applications today still rely on static UI elements like loaders, typing indicators, or basic transitions. These elements do not fully communicate system state, which leads to user uncertainty and reduced trust.

This guide provides a production-ready Rive setup you can directly apply to your AI application. It focuses on building a reusable animation system driven by real AI events such as listening, thinking, and responding.

The goal is not to create a demo, but to establish a scalable animation architecture that integrates cleanly with your app logic.

What You Will Build

A reusable Rive-based AI assistant system with:

  • Idle state (ready)
  • Listening state (input active)
  • Thinking state (processing)
  • Speaking state (output delivery)
  • Clean state machine logic
  • Developer-friendly input mapping

This setup works across:

  • Web applications
  • Flutter apps
  • React Native apps

Step 1: Create the Base Orb in Rive

Start with a minimal and performance-friendly structure.

Design guidelines:

  • Use a single circle as the base shape
  • Apply a gradient fill for depth
  • Add a soft outer glow layer
  • Keep vector layers optimized (avoid heavy blur stacking)

Optional layers:

  • Inner core for subtle motion
  • Outer ring for state-based effects

Keep your structure simple. Complex visuals should not compromise runtime performance.

Step 2: Define Core Animation States

Create separate animations for each AI state.

Idle

Purpose:

  • Communicate readiness without distraction

Animation:

  • Slight scale loop (e.g., 0.98 → 1.02)
  • Soft opacity or glow variation

Listening

Purpose:

  • Indicate active input capture

Animation:

  • Increased glow intensity
  • Pulsing expansion
  • Optional ripple effect

Thinking

Purpose:

  • Represent processing

Animation:

  • Rotational motion (outer ring or particles)
  • Color shift for visual distinction
  • Continuous loop with smooth easing

Speaking

Purpose:

  • Represent output delivery

Animation:

  • Scale driven by audioLevel input
  • Wave-like expansion
  • Controlled bounce or pulse

Ensure all states share a consistent visual language.

Step 3: Build the State Machine

Create a state machine inside Rive.

Artboard: AI_Orb

State Machine: Orb_SM

States:

  • Idle
  • Listening
  • Thinking
  • Speaking

Inputs:

  • isListening (boolean)
  • isThinking (boolean)
  • isSpeaking (boolean)
  • audioLevel (number)

Transition Logic

  • Idle → Listening when isListening = true
  • Listening → Thinking when isListening = false and isThinking = true
  • Thinking → Speaking when isSpeaking = true
  • Speaking → Idle when isSpeaking = false and isThinking = false

Keep conditions simple and avoid overlapping transitions.
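The transition table above lives inside the Rive editor, but it helps to mirror it in a small pure function on the app side so you can unit-test your event handling before wiring up the runtime. This is a minimal sketch; the state and input names match the Orb_SM setup above, and everything else is illustrative:

```typescript
type OrbState = "Idle" | "Listening" | "Thinking" | "Speaking";

interface OrbInputs {
  isListening: boolean;
  isThinking: boolean;
  isSpeaking: boolean;
}

// Pure function mirroring the Orb_SM transition table. Each case
// returns the next state, or stays put when no condition matches.
function nextOrbState(state: OrbState, inputs: OrbInputs): OrbState {
  switch (state) {
    case "Idle":
      return inputs.isListening ? "Listening" : "Idle";
    case "Listening":
      return !inputs.isListening && inputs.isThinking
        ? "Thinking"
        : "Listening";
    case "Thinking":
      return inputs.isSpeaking ? "Speaking" : "Thinking";
    case "Speaking":
      return !inputs.isSpeaking && !inputs.isThinking
        ? "Idle"
        : "Speaking";
  }
}
```

Because the function is pure, you can assert the whole Idle → Listening → Thinking → Speaking → Idle cycle in a few lines of test code, which catches overlapping or dead transitions early.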

Step 4: Connect AI Events to Rive Inputs

The animation system should not contain business logic. Your application should control all state changes.

Web Integration Example

import { Rive } from "@rive-app/canvas";

const rive = new Rive({
    src: "/ai-orb.riv",
    canvas: document.getElementById("canvas"),
    autoplay: true,
    stateMachines: "Orb_SM",
    onLoad: () => {
        // Look up the state machine inputs once the file has loaded.
        const inputs = rive.stateMachineInputs("Orb_SM");

        const isListening = inputs.find(i => i.name === "isListening");
        const isThinking = inputs.find(i => i.name === "isThinking");
        const isSpeaking = inputs.find(i => i.name === "isSpeaking");

        // Mirror the AI agent's lifecycle events onto the inputs.
        agent.on("input_start", () => {
            isListening.value = true;
        });

        agent.on("input_end", () => {
            isListening.value = false;
            isThinking.value = true;
        });

        agent.on("response_start", () => {
            isThinking.value = false;
            isSpeaking.value = true;
        });

        agent.on("response_end", () => {
            isSpeaking.value = false;
        });
    }
});

This pattern keeps your UI responsive to real AI activity.

Step 5: Flutter Integration Example

// Load the Rive file and attach the Orb_SM state machine controller.
final riveFile = await RiveFile.asset('assets/ai_orb.riv');
final artboard = riveFile.mainArtboard;
final controller = StateMachineController.fromArtboard(
    artboard,
    'Orb_SM',
);

if (controller != null) {
    artboard.addController(controller);

    final isListening = controller.findInput<bool>('isListening');
    final isThinking = controller.findInput<bool>('isThinking');
    final isSpeaking = controller.findInput<bool>('isSpeaking');

    // Mirror the AI agent's lifecycle events onto the inputs.
    aiAgent.onInputStart(() {
        isListening?.value = true;
    });

    aiAgent.onInputEnd(() {
        isListening?.value = false;
        isThinking?.value = true;
    });

    aiAgent.onResponseStart(() {
        isThinking?.value = false;
        isSpeaking?.value = true;
    });

    aiAgent.onResponseEnd(() {
        isSpeaking?.value = false;
    });
}

This architecture scales well for mobile applications.

Step 6: Add Audio Reactivity (Optional but Recommended)

To improve realism, connect audio output to the animation.

Use a numeric input:

  • audioLevel (0 to 1)

Then update it in real time:

const audioLevel = inputs.find(i => i.name === "audioLevel");

// audioEngine is your app's audio pipeline; push each level sample
// (normalized to 0–1) into the state machine input.
audioEngine.onLevelChange((level) => {
    audioLevel.value = level;
});

Design the speaking animation to respond smoothly to this value. Use interpolation to avoid jitter.
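One simple interpolation that works well here is exponential smoothing: blend each raw sample with the previous smoothed value before writing it to the input. This is a sketch, not a prescribed API; the `alpha` parameter and function name are illustrative:

```typescript
// Exponential smoothing for a noisy audio level signal. Raw amplitude
// samples flicker frame to frame; blending each new sample with the
// running value keeps the speaking animation steady. `alpha` controls
// responsiveness: closer to 1 reacts faster, closer to 0 is smoother.
function createLevelSmoother(alpha = 0.2) {
  let smoothed = 0;
  return (raw: number): number => {
    // Clamp to the 0–1 range the audioLevel input expects.
    const clamped = Math.min(1, Math.max(0, raw));
    smoothed = smoothed + alpha * (clamped - smoothed);
    return smoothed;
  };
}
```

You would call the returned function inside the level callback from the previous snippet, writing its result to `audioLevel.value` instead of the raw sample.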

Step 7: Production Best Practices

Keep Logic Outside Rive

  • Rive handles visuals only
  • All AI logic stays in your app

Optimize for Performance

  • Reduce vector complexity
  • Avoid excessive blur layers
  • Test on low-end devices

Maintain State Clarity

  • Each state must be visually distinct
  • Avoid ambiguous transitions

Handle Edge Cases

  • Interruptions during speaking
  • Errors during processing
  • Reset states cleanly
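A clean reset usually means forcing every boolean input back to false so the state machine settles into Idle. A minimal sketch, assuming the same input names as the Orb_SM examples (the `BoolInput` shape and helper name are illustrative):

```typescript
// Shape of a writable boolean state machine input (illustrative).
interface BoolInput { value: boolean; }

// Force all state inputs off so the orb returns to Idle. Call this on
// user interruption, agent errors, or reconnects. Missing inputs are
// skipped rather than crashing, which matters if the .riv file and the
// app code ever drift out of sync.
function resetOrb(inputs: Record<string, BoolInput | undefined>): void {
  for (const name of ["isListening", "isThinking", "isSpeaking"]) {
    const input = inputs[name];
    if (input) input.value = false;
  }
}
```

Tolerating a missing input here is a deliberate choice: a stale animation state is annoying, but an exception thrown from an error handler is worse.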

Use Consistent Timing

  • Align animation speed with AI response timing
  • Avoid long static states

Common Mistakes

  • Embedding logic inside Rive instead of the app
  • Overcomplicating the state machine
  • Ignoring real AI latency patterns
  • Using animation as decoration instead of communication
  • Not planning for error states

A well-structured Rive setup transforms AI interfaces from static experiences into responsive systems that communicate clearly with users.

By following this approach, you can build an AI assistant UI that:

  • Feels responsive and alive
  • Reflects real system behavior
  • Scales across platforms
  • Integrates cleanly with your application logic

This is not just animation. It is a system layer that improves usability and trust in AI products.

About the Author

Praneeth Kawya Thathsara

UI Animation Specialist · Rive Animator

Domains operated by Praneeth Kawya Thathsara:

website www.mascotengine.com

Praneeth Kawya Thathsara works remotely with global teams, designing production-ready UI animation systems, AI assistant interfaces, and interactive motion experiences.

Contact:

Email: mascotengine@gmail.com

Email: riveanimator@gmail.com

WhatsApp: +94 71 700 0999

Social:

Instagram: instagram.com/mascotengine

X (Twitter): x.com/mascotengine

LinkedIn: https://www.linkedin.com/in/praneethkawyathathsara/

All listed domains are owned and operated by Praneeth Kawya Thathsara.

If you are building an AI product and need a scalable Rive animation system, interactive assistant UI, or mascot-based interface, you can reach out for collaboration on production-ready solutions.