Build a Voice-Powered Crypto AI Agent with Next.js + Google Gemini + LunarCrush MCP in 25 Minutes

Transform cryptocurrency research with AI-powered voice interface and real-time market intelligence

Voice Crypto Assistant Demo

Why Voice + AI Changes Everything for Crypto Research

Traditional crypto analysis requires endless scrolling through charts, manually correlating social sentiment with price movements, and juggling multiple data sources. This creates significant cognitive overhead and research fatigue.

Voice-powered AI with the Model Context Protocol (MCP) changes how traders access real-time market intelligence. Instead of manually orchestrating multiple APIs, MCP creates secure, standardized connections between AI models and live data sources. The AI can combine voice commands with these structured connections to choose data tools, make complex decisions, and generate insights that would take hours to assemble by hand.

This means you can literally ask "What's the sentiment on Bitcoin?" and get comprehensive analysis combining price data, social metrics, technical indicators, and AI insights, all through natural conversation.

What You'll Build

Voice Crypto Assistant Results

In this tutorial, you'll create a production-ready Voice Crypto Assistant with:

  • ✅ Voice-First Interface - Natural speech recognition for hands-free crypto research
  • ✅ AI-Powered Detection - Google Gemini intelligently extracts cryptocurrency symbols from natural language
  • ✅ MCP Integration - Direct connection between Google Gemini AI and LunarCrush social intelligence tools
  • ✅ Real-time Progress - Live analysis tracking through a multi-step AI pipeline
  • ✅ Interactive Visualizations - Polished Material-UI components with responsive design
  • ✅ Advanced Voice Controls - Voice selection, speed control, volume control, pause/resume
  • ✅ Smart Editing - Immediate edit functionality when voice recognition needs correction
  • ✅ Professional UI - Dark theme optimized for trading and financial analysis

Time Investment: 25 minutes
Skill Level: Beginner to Intermediate
What You'll Learn: Next.js, TypeScript, MCP integration, Voice APIs, AI orchestration, production deployment

💡 Pro Tip: By the end, you'll have a portfolio-worthy project that demonstrates modern AI development patterns with voice interfaces!

Live Demo: View the deployed version →


Before We Start

You'll Need:

  • Node.js 18+ installed
  • Basic knowledge of React/TypeScript/Next.js
  • A code editor (VS Code recommended)
  • Microphone access for voice features
  • 2 API keys from different services (we'll walk through signup below)

Two Ways to Experience This Tutorial:

  1. 👨‍💻 Build It Yourself - Follow along step-by-step with your own API keys
  2. 🚀 Try the Live Demo - View the deployed version and explore the code

Quick Project Setup:

# We'll build this step-by-step, but here's the project setup at a glance:
npx create-next-app@latest voice-crypto-assistant --typescript --tailwind --eslint --app
cd voice-crypto-assistant
npm install @google/generative-ai @modelcontextprotocol/sdk @mui/material @mui/icons-material

🚨 Common Issue: Make sure you have Node.js 18+ installed. Check with node --version

Account Setup Guide

We need API keys from 2 services for this project. Google Gemini offers a generous free tier; LunarCrush requires a subscription plan (a discount code is below).

Sign Up For LunarCrush API

Through its MCP server integration, LunarCrush provides social sentiment data that most traders don't have access to.

Use my discount referral code JAMAALBUILDS to receive 15% off your plan.

  1. Visit LunarCrush Signup
  2. Enter your email address and click "Continue"
  3. Check your email for verification code and enter it
  4. Complete the onboarding steps:
    • Select your favorite categories (or keep defaults)
    • Create your profile (add photo and nickname if desired)
    • Important: Select a subscription plan (you'll need it to generate an API key)

Generate Your API Key

Once you've subscribed, navigate to the API authentication page and generate an API key.

Save this API key - you'll add it to your environment variables later.

Set Up Google Gemini AI

Google's Gemini AI will handle voice understanding, crypto detection, and intelligent tool orchestration.

  1. Sign up: Visit aistudio.google.com and click "Get API key"
  2. Choose authentication: Sign in with your Google account
  3. Create API key:
    • Click "Create API key"
    • Choose "Create API key in new project" or select existing project
    • Copy your API key (starts with AIza...)
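
Optional: once you've installed the project dependencies in the next section, you can smoke-test the key with a tiny standalone script. This is a hedged sketch, not part of the tutorial's file set; the file name check-gemini.ts is an assumption, and it uses the same @google/generative-ai calls the API route relies on later:

// check-gemini.ts (optional, hypothetical helper) - verify the Gemini key authenticates
import { GoogleGenerativeAI } from '@google/generative-ai';

async function main() {
    const genAI = new GoogleGenerativeAI(process.env.GEMINI_API_KEY ?? '');
    const model = genAI.getGenerativeModel({ model: 'gemini-2.0-flash-lite' });

    // A trivial prompt is enough to confirm authentication and quota
    const result = await model.generateContent('Reply with the single word: ok');
    console.log(result.response.text());
}

main().catch((error) => {
    console.error('Gemini key check failed:', error);
    process.exit(1);
});

Run it with a TypeScript runner of your choice (for example npx tsx check-gemini.ts) after exporting GEMINI_API_KEY in your shell.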

Environment Setup

Create .env.local:

# LunarCrush API (Required)
LUNARCRUSH_API_KEY=lc_your_key_here

# Google Gemini AI (Required)
GEMINI_API_KEY=your_gemini_key_here

# Optional: Enable debug mode
DEBUG=false

Project Setup

Now let's build our Voice Crypto Assistant step by step.

Create Next.js Project

# Create new Next.js project with TypeScript
npx create-next-app@latest voice-crypto-assistant --typescript --tailwind --eslint --app
cd voice-crypto-assistant

# Install required dependencies
npm install @google/generative-ai @modelcontextprotocol/sdk @mui/material @mui/icons-material @emotion/react @emotion/styled @mui/material-nextjs react-speech-recognition regenerator-runtime

# Install TypeScript types
npm install --save-dev @types/react-speech-recognition

# Create environment file
touch .env.local

Set Up Environment Variables

Add your API keys to .env.local:

# .env.local
LUNARCRUSH_API_KEY=lc_your_key_here
GEMINI_API_KEY=your_gemini_key_here
DEBUG=false
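
The API route you'll create later reads these keys straight from process.env and streams back an error event if either is missing. If you prefer to fail fast with a clearer message, here is a minimal optional sketch; src/lib/env.ts and the requireEnv helper are assumptions, not files the tutorial's commands create:

// src/lib/env.ts (optional, assumed file name) - centralize required API key lookups
export function requireEnv(name: 'LUNARCRUSH_API_KEY' | 'GEMINI_API_KEY'): string {
    const value = process.env[name];
    if (!value) {
        // Throwing here produces an explicit error instead of a vague failure deep in the analysis pipeline
        throw new Error(`Missing required environment variable: ${name}`);
    }
    return value;
}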

Create Project Structure (Copy/Paste Terminal Commands)

# Create directory structure
mkdir -p src/components src/hooks src/lib src/types


# Create formatting utilities
cat > src/lib/formatters.ts << 'EOF'
/**
 * Format large numbers into readable formats (10M, 100B, etc.)
 */
export function formatLargeNumber(
    value: string | number | null | undefined
): string {
    // Handle null/undefined values
    if (value === null || value === undefined) {
        return 'N/A';
    }

    // Handle string inputs that might have $ or commas
    const numStr = typeof value === 'string' ? value : value.toString();

    // Remove $ and commas to get clean number
    const cleanStr = numStr.replace(/[$,]/g, '');
    const num = parseFloat(cleanStr);

    // If it's not a valid number, return original string
    if (isNaN(num)) {
        return numStr;
    }

    // Handle special cases
    if (num === 0) return '0';
    if (num < 1000) return num.toFixed(0);

    const units = [
        { value: 1e12, symbol: 'T' }, // Trillion
        { value: 1e9, symbol: 'B' }, // Billion
        { value: 1e6, symbol: 'M' }, // Million
        { value: 1e3, symbol: 'K' }, // Thousand
    ];

    for (const unit of units) {
        if (num >= unit.value) {
            const formatted = (num / unit.value).toFixed(1);
            // Remove .0 if it's a whole number
            const clean = formatted.endsWith('.0')
                ? formatted.slice(0, -2)
                : formatted;
            return `${clean}${unit.symbol}`;
        }
    }

    return num.toFixed(0);
}

/**
 * Format currency values with appropriate scaling
 */
export function formatCurrency(
    value: string | number | null | undefined
): string {
    // Handle null/undefined values
    if (value === null || value === undefined) {
        return 'N/A';
    }

    const numStr = typeof value === 'string' ? value : value.toString();

    // If it already has $ and looks formatted, just scale it
    if (numStr.includes('$')) {
        const formatted = formatLargeNumber(numStr);
        return formatted.startsWith('$') ? formatted : `$${formatted}`;
    }

    // Otherwise add $ and format
    const formatted = formatLargeNumber(numStr);
    return `$${formatted}`;
}

/**
 * Format percentage values
 */
export function formatPercentage(
    value: string | number | null | undefined
): string {
    // Handle null/undefined values
    if (value === null || value === undefined) {
        return 'N/A';
    }

    const numStr = typeof value === 'string' ? value : value.toString();

    // Remove % if it exists
    const cleanStr = numStr.replace('%', '');
    const num = parseFloat(cleanStr);

    if (isNaN(num)) return numStr;

    // Format with 1 decimal place and add %
    return `${num.toFixed(1)}%`;
}

/**
 * Format whole numbers with commas (for smaller counts)
 */
export function formatCount(value: string | number | null | undefined): string {
    // Handle null/undefined values
    if (value === null || value === undefined) {
        return 'N/A';
    }

    const numStr = typeof value === 'string' ? value : value.toString();

    // Remove commas to get clean number
    const cleanStr = numStr.replace(/,/g, '');
    const num = parseFloat(cleanStr);

    if (isNaN(num)) return numStr;

    // For counts under 100K, show with commas
    if (num < 100000) {
        return num.toLocaleString();
    }

    // For larger counts, use abbreviated format
    return formatLargeNumber(num);
}

/**
 * Smart formatter that automatically chooses the best format
 */
export function smartFormat(
    value: string | number | null | undefined,
    type?: 'currency' | 'percentage' | 'count'
): string {
    // Handle null/undefined values
    if (value === null || value === undefined) {
        return 'N/A';
    }

    if (type === 'currency') return formatCurrency(value);
    if (type === 'percentage') return formatPercentage(value);
    if (type === 'count') return formatCount(value);

    // Auto-detect based on value
    const str = typeof value === 'string' ? value : value.toString();

    if (str.includes('$')) return formatCurrency(value);
    if (str.includes('%')) return formatPercentage(value);

    return formatLargeNumber(value);
}
EOF
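
Quick sanity check: the expected outputs in the comments below follow directly from the logic above, and the import path assumes the default @/* alias that create-next-app configures for the src directory:

import { formatCurrency, formatPercentage, formatCount, smartFormat } from '@/lib/formatters';

// Large dollar values are abbreviated with a unit suffix
console.log(formatCurrency(1234567890)); // "$1.2B"
console.log(formatCurrency('$45,000')); // "$45K"

// Percentages are normalized to one decimal place
console.log(formatPercentage('3.456')); // "3.5%"

// Counts under 100K keep locale separators; larger counts are abbreviated
console.log(formatCount(98765)); // "98,765"
console.log(formatCount(2500000)); // "2.5M"

// smartFormat auto-detects the format based on $ / % markers
console.log(smartFormat('$2,100,000,000')); // "$2.1B"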

Core Implementation (Copy/Paste Terminal Commands)

Create the Analysis API Route

# Create the main analysis API endpoint (the route directory doesn't exist yet)
mkdir -p src/app/api/analyze
cat > src/app/api/analyze/route.ts << 'EOF'
import { NextRequest } from 'next/server';
import { GoogleGenerativeAI } from '@google/generative-ai';
import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { SSEClientTransport } from '@modelcontextprotocol/sdk/client/sse.js';

// Add request timeout for production
export const maxDuration = 60;

interface ToolCall {
    tool: string;
    args: Record<string, unknown>;
    reason: string;
    expected_data?: string;
    link_format?: string;
}

interface TradingAnalysis {
    symbol: string;
    recommendation: 'BUY' | 'SELL' | 'HOLD';
    confidence: number;
    reasoning: string;
    social_sentiment: 'bullish' | 'bearish' | 'neutral';
    key_metrics: Record<string, unknown>;
    ai_analysis: {
        summary: string;
        pros: string[];
        cons: string[];
        key_factors: string[];
    };
    timestamp: string;
    chart_data: Array<{ date: string; price: number }>;
    success: boolean;
}

export async function POST(request: NextRequest) {
    let client: Client | null = null;

    try {
        const { query } = await request.json();

        if (!query) {
            return new Response('{"error": "Query is required"}\n', {
                status: 400,
                headers: {
                    'Content-Type': 'application/json',
                    'Cache-Control': 'no-cache',
                    'Access-Control-Allow-Origin': '*',
                    'Access-Control-Allow-Methods': 'GET, POST, PUT, DELETE, OPTIONS',
                    'Access-Control-Allow-Headers': 'Content-Type, Authorization',
                },
            });
        }

        // Initialize HTTP chunked streaming response
        const stream = new ReadableStream({
            start(controller) {
                const encoder = new TextEncoder();

                const send = (data: any) => {
                    controller.enqueue(encoder.encode(`${JSON.stringify(data)}\n`));
                };

                const sendError = (error: string) => {
                    send({
                        type: 'error',
                        message: 'Analysis failed',
                        speak:
                            'Sorry, I encountered an issue analyzing that. Please try again.',
                        error,
                        timestamp: Date.now(),
                    });
                    controller.close();
                };

                const processAnalysis = async () => {
                    try {
                        // Step 1: Get API keys first
                        const lunarKey = process.env.LUNARCRUSH_API_KEY;
                        const geminiKey =
                            process.env.GEMINI_API_KEY || process.env.GOOGLE_GEMINI_API_KEY;

                        if (!lunarKey || !geminiKey) {
                            sendError('API keys not configured');
                            return;
                        }

                        send({
                            type: 'progress',
                            message: 'Initializing crypto analysis...',
                            step: 1,
                            totalSteps: 7,
                            timestamp: Date.now(),
                        });

                        // Step 2: Let Gemini extract and determine the cryptocurrency from the query
                        const cryptoInfo = await extractCryptoFromQuery(query, geminiKey);
                        const symbol = cryptoInfo.symbol;
                        const fullName = cryptoInfo.fullName;

                        console.log(
                            `Starting streaming analysis for ${symbol} (${fullName})`
                        );

                        send({
                            type: 'progress',
                            message: `Analyzing ${fullName} (${symbol})...`,
                            step: 2,
                            totalSteps: 7,
                            symbol,
                            fullName,
                            timestamp: Date.now(),
                        });

                        // Step 3: Create and connect MCP client
                        client = await createMCPClient(lunarKey);
                        const genAI = new GoogleGenerativeAI(geminiKey);
                        const model = genAI.getGenerativeModel({
                            model: 'gemini-2.0-flash-lite',
                        });

                        console.log('MCP client initialized successfully');

                        send({
                            type: 'progress',
                            message: 'Connected to LunarCrush data sources...',
                            step: 3,
                            totalSteps: 7,
                            timestamp: Date.now(),
                        });

                        // Step 4: Get available tools
                        console.log(`Fetching available MCP tools...`);
                        const { tools } = await client.listTools();
                        console.log(
                            `Available MCP tools: ${tools.map((t: any) => t.name).join(', ')}`
                        );

                        send({
                            type: 'progress',
                            message: `Found ${tools.length} analysis tools available...`,
                            step: 4,
                            totalSteps: 7,
                            toolsAvailable: tools.length,
                            timestamp: Date.now(),
                        });

                        // Step 5: Let Gemini choose which tools to use with enhanced instructions
                        const orchestrationPrompt = createEnhancedOrchestrationPrompt(
                            symbol,
                            fullName,
                            tools
                        );
                        console.log(
                            `Letting Gemini choose tools for ${symbol} analysis...`
                        );

                        const orchestrationResult = await model.generateContent(
                            orchestrationPrompt
                        );
                        const orchestrationText = orchestrationResult.response.text();

                        send({
                            type: 'progress',
                            message: 'Planning comprehensive market analysis...',
                            step: 5,
                            totalSteps: 7,
                            timestamp: Date.now(),
                        });

                        // Step 6: Execute the tool calls
                        console.log(`Starting tool execution phase...`);
                        const gatheredData = await executeToolCalls(
                            client,
                            orchestrationText,
                            symbol
                        );

                        // Enhanced logging for monitoring
                        console.log(`Tool execution summary:`);
                        console.log(
                            `   Total tools attempted: ${
                                gatheredData.toolResults?.length || 0
                            }`
                        );
                        const successfulTools =
                            gatheredData.toolResults?.filter((r: any) => !r.error) || [];
                        const failedTools =
                            gatheredData.toolResults?.filter((r: any) => r.error) || [];
                        console.log(`   Successful tools: ${successfulTools.length}`);
                        console.log(`   Failed tools: ${failedTools.length}`);

                        if (successfulTools.length > 0) {
                            console.log(
                                `   Working tools: ${successfulTools
                                    .map((t: any) => t.tool)
                                    .join(', ')}`
                            );
                            successfulTools.forEach((tool: any) => {
                                const responseLength = tool.raw_response?.length || 0;
                                console.log(
                                    `      ${tool.tool}: ${responseLength} chars of data`
                                );
                            });
                        }

                        if (failedTools.length > 0) {
                            console.log(
                                `   Failed tools: ${failedTools
                                    .map((t: any) => `${t.tool} (${t.error})`)
                                    .join(', ')}`
                            );
                        }

                        send({
                            type: 'progress',
                            message: `Market data gathered from ${successfulTools.length} sources...`,
                            step: 6,
                            totalSteps: 7,
                            toolsUsed: successfulTools.length,
                            toolsFailed: failedTools.length,
                            timestamp: Date.now(),
                        });

                        // Step 7: Let Gemini analyze the gathered data with enhanced prompts
                        const analysisPrompt = createEnhancedAnalysisPrompt(
                            symbol,
                            gatheredData
                        );
                        console.log('Generating comprehensive analysis...');
                        console.log(
                            `Analysis prompt length: ${analysisPrompt.length} characters`
                        );
                        console.log(
                            `Data being analyzed: ${
                                JSON.stringify(gatheredData).length
                            } characters`
                        );

                        const analysisResult = await model.generateContent(analysisPrompt);
                        const analysisText = analysisResult.response.text();
                        console.log(
                            `Gemini analysis response length: ${analysisText.length} characters`
                        );

                        // Step 8: Parse and return the analysis
                        const analysisData = parseAnalysisResponse(
                            analysisText,
                            symbol,
                            gatheredData
                        );

                        // Enhanced logging for final results
                        console.log(`Final analysis summary:`);
                        console.log(`   Symbol: ${analysisData.symbol}`);
                        console.log(`   Recommendation: ${analysisData.recommendation}`);
                        console.log(`   Confidence: ${analysisData.confidence}%`);
                        console.log(`   Sentiment: ${analysisData.social_sentiment}`);
                        console.log(
                            `   Key metrics count: ${
                                Object.keys(analysisData.key_metrics).length
                            }`
                        );
                        console.log(
                            `   Chart data points: ${analysisData.chart_data.length}`
                        );
                        console.log(
                            `   Analysis pros: ${analysisData.ai_analysis.pros.length}`
                        );
                        console.log(
                            `   Analysis cons: ${analysisData.ai_analysis.cons.length}`
                        );

                        const completenessScore =
                            calculateAnalysisCompleteness(analysisData);
                        console.log(`   Analysis completeness: ${completenessScore}%`);

                        send({
                            type: 'complete',
                            data: analysisData,
                            metadata: {
                                query,
                                cryptocurrency: symbol,
                                fullName,
                                analysisTime: new Date().toISOString(),
                                hasMarketData: !!gatheredData.toolResults?.length,
                                toolsUsed: successfulTools.length,
                                toolsFailed: failedTools.length,
                                dataCompleteness: completenessScore,
                                totalProcessingTime: Date.now(),
                            },
                            message: 'Analysis complete!',
                            speak:
                                "That's my comprehensive analysis! Let me know if you need more details.",
                            timestamp: Date.now(),
                        });

                        controller.close();
                    } catch (error) {
                        console.error('Streaming analysis error:', error);
                        sendError(
                            error instanceof Error ? error.message : 'Analysis failed'
                        );
                    } finally {
                        // Clean up MCP client connection
                        if (client) {
                            try {
                                await client.close();
                                console.log('MCP client connection closed');
                            } catch (cleanupError) {
                                console.warn(
                                    'Warning: MCP client cleanup failed:',
                                    cleanupError
                                );
                            }
                        }
                    }
                };

                // Start the analysis process
                processAnalysis();
            },
        });

        return new Response(stream, {
            headers: {
                'Content-Type': 'application/json',
                'Cache-Control': 'no-cache',
                'Access-Control-Allow-Origin': '*',
                'Access-Control-Allow-Methods': 'GET, POST, PUT, DELETE, OPTIONS',
                'Access-Control-Allow-Headers': 'Content-Type, Authorization',
            },
        });
    } catch (error) {
        console.error('Streaming endpoint error:', error);
        return new Response(
            `{"type": "error", "message": "Analysis failed", "error": "${
                error instanceof Error ? error.message : 'Unknown error'
            }", "timestamp": ${Date.now()}}\n`,
            {
                status: 500,
                headers: {
                    'Content-Type': 'application/json',
                    'Cache-Control': 'no-cache',
                    'Access-Control-Allow-Origin': '*',
                    'Access-Control-Allow-Methods': 'GET, POST, PUT, DELETE, OPTIONS',
                    'Access-Control-Allow-Headers': 'Content-Type, Authorization',
                },
            }
        );
    }
}

// Create MCP client using the exact same method as the working repo
async function createMCPClient(apiKey: string): Promise<Client> {
    console.log('Initializing MCP client with official SDK...');

    // Create SSE transport for LunarCrush MCP server
    const transport = new SSEClientTransport(
        new URL(`https://lunarcrush.ai/sse?key=${apiKey}`)
    );

    // Create MCP client
    const client = new Client(
        {
            name: 'voice-crypto-assistant',
            version: '1.0.0',
        },
        {
            capabilities: {
                tools: {},
            },
        }
    );

    // Connect to the server
    await client.connect(transport);
    console.log('MCP client connected successfully');

    return client;
}

function createEnhancedOrchestrationPrompt(
    symbol: string,
    fullName: string,
    availableTools: any[]
): string {
    return `
You are a cryptocurrency analyst with access to powerful LunarCrush MCP tools. Your task is to analyze ${fullName} (${symbol}) using the most appropriate tools available.

AVAILABLE MCP TOOLS:
${JSON.stringify(availableTools, null, 2)}

ANALYSIS TARGET: ${fullName} (${symbol})

ENHANCED INSTRUCTIONS:
1. **Study the tool schemas carefully** - Each tool has specific input requirements and data types
2. **Choose 3-5 complementary tools** that will provide comprehensive analysis coverage
3. **Use proper parameters** - Follow the exact schema requirements for each tool
4. **Try flexible topic matching** - Use "${symbol}", "${fullName}", or keyword variations as needed
5. **Prioritize working tools** - Choose tools most likely to have data for this specific cryptocurrency

TOOL SELECTION STRATEGY:
Choose tools that cover these essential areas:
- **Market Data & Performance**: Price, volume, market cap, rankings, historical performance
- **Social Intelligence**: Mentions, sentiment, engagement, community activity, social dominance
- **Market Analysis**: Competitive positioning, market trends, correlation analysis
- **Risk & Technical Metrics**: Volatility, technical indicators, market health scores

PARAMETER OPTIMIZATION GUIDELINES:
- **topic/symbol parameters**: Try "${symbol}" first, then "${fullName}", then keyword variants
- **Date parameters**: Use recent dates like "2025-06-24" to "2025-07-01" for current data
- **Required vs Optional**: Include ALL required parameters, add optionals only if clearly beneficial
- **Data types**: Match schema exactly - strings in quotes, numbers without quotes, arrays as arrays
- **Enum values**: Use exact enum options from schema (check carefully for valid options)

OUTPUT FORMAT:
Respond with a JSON array of tool calls:

[
  {
    "tool": "exact_tool_name_from_schema",
    "args": {
      "topic": "${symbol}",
      "required_param": "exact_value_matching_schema"
    },
    "reason": "Why this tool is essential for comprehensive ${fullName} analysis",
    "expected_data": "What specific key data this will provide (price, sentiment, metrics, etc.)",
    "link_format": "Expected REST endpoint format for this tool"
  }
]

CRITICAL SUCCESS REQUIREMENTS:
- **Match schemas exactly** - Wrong parameter names/types cause immediate failures
- **Focus on high-value tools** - Choose tools most likely to return meaningful data for ${symbol}
- **Quality over quantity** - 3-5 well-chosen tools beats many random/duplicate ones
- **Real parameter values** - Use actual dates, proper symbol formats, valid enum options
- **No redundancy** - Avoid multiple tools that return essentially the same data type

TOOL SELECTION PRIORITIES:
1. **Primary Market Data** - Choose 1-2 tools for core price/volume/ranking data
2. **Social Intelligence** - Choose 1-2 tools for sentiment/community/engagement metrics
3. **Analytical Tools** - Choose 1-2 tools for deeper analysis/comparisons/insights
4. **Avoid** - Skip tools with complex/unusual parameter requirements unless essential

Select the optimal tool set for comprehensive ${fullName} (${symbol}) analysis now.`;
}

async function executeToolCalls(
    client: Client,
    orchestrationText: string,
    symbol: string
): Promise<any> {
    try {
        // Extract JSON array from response
        const jsonMatch = orchestrationText.match(/\[[\s\S]*\]/);
        if (!jsonMatch) {
            console.log('No JSON array found, using fallback');
            return {
                symbol: symbol.toUpperCase(),
                toolResults: [],
                error: 'No tool calls found in response',
            };
        }

        const toolCalls: ToolCall[] = JSON.parse(jsonMatch[0]);
        const gatheredData: any = {
            symbol: symbol.toUpperCase(),
            toolResults: [],
        };

        // Execute tool calls concurrently
        const toolPromises = toolCalls.map(async (toolCall: ToolCall) => {
            try {
                console.log(`Executing: ${toolCall.tool}`);
                console.log(`   Reason: ${toolCall.reason}`);
                console.log(
                    `   Expected Data: ${toolCall.expected_data || 'Not specified'}`
                );
                console.log(
                    `   Link Format: ${toolCall.link_format || 'Not specified'}`
                );
                console.log(`   Arguments: ${JSON.stringify(toolCall.args)}`);

                const result = await client.callTool({
                    name: toolCall.tool,
                    arguments: toolCall.args,
                });

                console.log(`   ${toolCall.tool} completed successfully`);

                // Extract the raw text response from MCP with proper type checking
                const rawResponse =
                    Array.isArray(result.content) &&
                    result.content.length > 0 &&
                    result.content[0]?.text
                        ? result.content[0].text
                        : 'No response';
                console.log(`   Raw response type: ${typeof rawResponse}`);
                console.log(
                    `   Raw response preview: ${rawResponse.substring(0, 200)}...`
                );

                return {
                    tool: toolCall.tool,
                    args: toolCall.args,
                    reason: toolCall.reason,
                    expected_data: toolCall.expected_data,
                    link_format: toolCall.link_format,
                    raw_response: rawResponse, // Keep the raw text for Gemini to parse
                    result,
                };
            } catch (error) {
                console.error(`Tool ${toolCall.tool} failed:`, error);
                console.error(`   Was trying to: ${toolCall.reason}`);
                console.error(`   With arguments: ${JSON.stringify(toolCall.args)}`);
                return {
                    tool: toolCall.tool,
                    args: toolCall.args,
                    reason: toolCall.reason,
                    expected_data: toolCall.expected_data,
                    link_format: toolCall.link_format,
                    raw_response: null,
                    error: error instanceof Error ? error.message : 'Unknown error',
                };
            }
        });

        gatheredData.toolResults = await Promise.all(toolPromises);
        return gatheredData;
    } catch (error) {
        console.error('Error executing tool choices:', error);
        return {
            symbol: symbol.toUpperCase(),
            toolResults: [],
            error: error instanceof Error ? error.message : 'Unknown error',
        };
    }
}

function createEnhancedAnalysisPrompt(
    symbol: string,
    gatheredData: any
): string {
    return `
You are an expert cryptocurrency analyst with deep market knowledge. You have gathered comprehensive data from LunarCrush MCP tools for ${symbol.toUpperCase()}, but the tools returned raw text/markdown responses instead of structured JSON. Your job is to intelligently parse this raw data and provide a professional trading recommendation.

RAW DATA FROM MCP TOOLS:
${JSON.stringify(gatheredData, null, 2)}

ENHANCED PARSING INSTRUCTIONS:
1. **Parse the raw_response fields thoroughly** - These contain the actual valuable data in text/markdown format
2. **Extract all numeric values** - Look for prices, volumes, scores, percentages, rankings, etc.
3. **Identify sentiment indicators** - Words like "bullish", "bearish", "positive", "negative", "growing", "declining"
4. **Handle various text formats** - Data may be in markdown tables, plain text, or structured formats
5. **Focus on successful tools** - Prioritize data from tools that didn't error out
6. **Cross-reference data sources** - Look for consistency across multiple tool responses

COMPREHENSIVE ANALYSIS FRAMEWORK:
Parse the raw responses to extract and synthesize:

1. **MARKET FUNDAMENTALS & PERFORMANCE**
   - Current price action, daily/weekly performance trends
   - Volume patterns, liquidity analysis, and market participation
   - Market capitalization, ranking, and relative market positioning
   - Historical performance vs Bitcoin, Ethereum, and top altcoins

2. **SOCIAL INTELLIGENCE & COMMUNITY HEALTH**
   - Social sentiment trends, community engagement levels
   - Social mentions velocity, quality, and reach metrics
   - Influencer activity, developer activity, community growth patterns
   - Social dominance score and market buzz analysis

3. **TECHNICAL & MARKET POSITIONING**
   - Market ranking position and competitive landscape analysis
   - Price trend analysis, momentum indicators, volatility metrics
   - Support/resistance levels, technical pattern recognition
   - Market cycle positioning and correlation with broader crypto markets

4. **RISK ASSESSMENT & INVESTMENT PERSPECTIVE**
   - Volatility analysis, downside risk evaluation
   - Regulatory considerations and fundamental project risks
   - Market cycle timing and macroeconomic factors
   - Portfolio allocation recommendations and risk management

CRITICAL DATA EXTRACTION REQUIREMENTS:
- **Parse numeric values precisely** - Extract actual prices, volumes, percentages from text
- **Identify trending patterns** - Look for growth/decline indicators in the data
- **Extract sentiment signals** - Find qualitative assessments within the text responses
- **Map data to metrics** - Convert text descriptions to quantifiable key_metrics
- **If chart/time series data exists**, convert it to proper JSON format
- **Acknowledge data limitations** - If insufficient data, state it clearly rather than guessing

ENHANCED OUTPUT REQUIREMENTS:
- **Educational insights** that explain what the parsed data means for traders/investors
- **Specific reasoning** based on actual extracted data points, not generic crypto advice
- **Realistic confidence levels** based on data quality and completeness
- **Actionable recommendations** with clear risk considerations

Respond with a JSON object in this exact format:
{
  "recommendation": "BUY|SELL|HOLD",
  "confidence": 0-100,
  "reasoning": "Clear explanation based on the ACTUAL parsed data from tool responses - cite specific metrics",
  "social_sentiment": "bullish|bearish|neutral",
  "key_metrics": {
    "price": "extracted_price_or_0",
    "galaxy_score": "extracted_score_or_N/A",
    "alt_rank": "extracted_rank_or_N/A",
    "social_dominance": "extracted_dominance_or_N/A",
    "market_cap": "extracted_cap_or_0",
    "volume_24h": "extracted_volume_or_0",
    "mentions": "extracted_mentions_or_N/A",
    "engagements": "extracted_engagements_or_N/A",
    "creators": "extracted_creators_or_N/A",
    "sentiment_score": "extracted_sentiment_or_N/A",
    "price_change_24h": "extracted_change_or_0"
  },
  "ai_analysis": {
    "summary": "1-2 sentence overview based on ACTUAL parsed data with specific metrics cited",
    "pros": ["Positive factor 1 from parsed data", "Positive factor 2 from parsed data", "Positive factor 3 if available"],
    "cons": ["Risk factor 1 from parsed data", "Risk factor 2 from parsed data", "Risk factor 3 if available"],
    "key_factors": ["Critical factor 1 from data", "Critical factor 2 from data", "Critical factor 3 if available"]
  },
  "chart_data": [{"date": "YYYY-MM-DD", "price": actual_parsed_price}],
  "marketData": {
    "price": actual_parsed_price_or_0,
    "change24h": actual_parsed_change_or_0,
    "volume": actual_parsed_volume_or_0,
    "marketCap": actual_parsed_cap_or_0,
    "available": true_if_real_data_was_parsed_false_otherwise
  }
}

CRITICAL SUCCESS REQUIREMENTS:
- **Parse and extract real values** from the text responses - don't use placeholder values
- **If chart/time series data is in text**, convert it to the required JSON format
- **Focus on educational insights** that explain what the parsed data means for trading decisions
- **Ensure all numeric values are actual numbers**, not strings (except when marked as strings)
- **Maintain JSON format integrity** - no trailing commas or syntax errors
- **Cite specific data** - Reference actual numbers/metrics found in the tool responses
- **Be realistic about confidence** - Lower confidence if data is limited or conflicting

Remember: The MCP tools returned TEXT/MARKDOWN, not JSON. Your job is to be an intelligent parser and extract valuable insights from these text responses while providing professional trading analysis.`;
}

function parseAnalysisResponse(
    responseText: string,
    symbol: string,
    gatheredData: any
): TradingAnalysis {
    try {
        console.log('Gemini raw response:', responseText);

        // Extract JSON from response
        const jsonMatch = responseText.match(/\{[\s\S]*\}/);
        if (!jsonMatch) {
            throw new Error('No JSON found in Gemini response');
        }

        let jsonText = jsonMatch[0];

        // Handle truncated JSON by trying to fix common issues
        if (!jsonText.endsWith('}')) {
            const lastCompleteField = jsonText.lastIndexOf('"}');
            if (lastCompleteField > 0) {
                jsonText = jsonText.substring(0, lastCompleteField + 2) + '}';
            }
        }

        const analysis = JSON.parse(jsonText);

        // Validate and format response
        return {
            symbol: symbol.toUpperCase(),
            recommendation: analysis.recommendation || 'HOLD',
            confidence: analysis.confidence || 50,
            reasoning: analysis.reasoning || 'Analysis completed',
            social_sentiment: analysis.social_sentiment || 'neutral',
            key_metrics: analysis.key_metrics || {},
            ai_analysis: analysis.ai_analysis || {
                summary: 'Analysis completed',
                pros: [],
                cons: [],
                key_factors: [],
            },
            timestamp: new Date().toISOString(),
            chart_data: transformChartData(analysis.chart_data || []),
            success: true,
        };
    } catch (error) {
        console.error('Error parsing Gemini response:', error);

        // Fallback response
        return {
            symbol: symbol.toUpperCase(),
            recommendation: 'HOLD',
            confidence: 50,
            reasoning: 'Analysis completed with limited data',
            social_sentiment: 'neutral',
            key_metrics: gatheredData || {},
            ai_analysis: {
                summary: 'Unable to complete full AI analysis. Please try again.',
                pros: [],
                cons: ['Analysis parsing failed'],
                key_factors: [],
            },
            timestamp: new Date().toISOString(),
            chart_data: [],
            success: true,
        };
    }
}

function transformChartData(
    chartData: Array<{
        time?: string;
        date?: string;
        close?: number;
        price?: number;
    }>
): Array<{ date: string; price: number }> {
    if (!Array.isArray(chartData) || chartData.length === 0) {
        return [];
    }

    // Transform and filter valid data points
    const transformedData = chartData
        .map((item) => ({
            date: item.time || item.date || '',
            price: item.close || item.price || 0,
        }))
        .filter((item) => item.date && item.price > 0)
        .sort((a, b) => new Date(a.date).getTime() - new Date(b.date).getTime());

    return transformedData;
}

// Calculate analysis completeness score for monitoring
function calculateAnalysisCompleteness(analysis: TradingAnalysis): number {
    const checks = [
        !!analysis.symbol,
        !!analysis.recommendation,
        typeof analysis.confidence === 'number' && analysis.confidence > 0,
        !!analysis.reasoning && analysis.reasoning.length > 20,
        !!analysis.social_sentiment,
        !!analysis.key_metrics && Object.keys(analysis.key_metrics).length > 0,
        !!analysis.ai_analysis &&
            !!analysis.ai_analysis.summary &&
            analysis.ai_analysis.summary.length > 10,
        !!analysis.ai_analysis &&
            Array.isArray(analysis.ai_analysis.pros) &&
            analysis.ai_analysis.pros.length > 0,
        !!analysis.ai_analysis &&
            Array.isArray(analysis.ai_analysis.cons) &&
            analysis.ai_analysis.cons.length > 0,
        !!analysis.ai_analysis &&
            Array.isArray(analysis.ai_analysis.key_factors) &&
            analysis.ai_analysis.key_factors.length > 0,
        !!analysis.chart_data && Array.isArray(analysis.chart_data),
        !!analysis.timestamp,
        analysis.success === true,
    ];

    return Math.round((checks.filter(Boolean).length / checks.length) * 100);
}

// Intelligent crypto extraction using Gemini AI
async function extractCryptoFromQuery(
    query: string,
    geminiKey: string
): Promise<{ symbol: string; fullName: string }> {
    try {
        const genAI = new GoogleGenerativeAI(geminiKey);
        const model = genAI.getGenerativeModel({ model: 'gemini-2.0-flash-lite' });

        const extractionPrompt = `
You are a cryptocurrency expert. Analyze the following user query and extract the primary cryptocurrency they want to analyze.

User Query: "${query}"

INSTRUCTIONS:
1. Identify any cryptocurrency mentioned (by name, symbol, or nickname)
2. If multiple cryptocurrencies are mentioned, prioritize the main one being discussed
3. If no specific cryptocurrency is mentioned, suggest Bitcoin as default
4. Handle common variations and be precise with popular cryptocurrencies

IMPORTANT: Your response MUST be in this EXACT format - two lines only:
SYMBOL: [TICKER_SYMBOL]
NAME: [FULL_CRYPTOCURRENCY_NAME]

Examples:
- "What's the sentiment for Bitcoin?" β†’ SYMBOL: BTC\nNAME: Bitcoin
- "How is ETH performing?" β†’ SYMBOL: ETH\nNAME: Ethereum
- "Tell me about Solana price" β†’ SYMBOL: SOL\nNAME: Solana
- "Analyze DOGE social metrics" β†’ SYMBOL: DOGE\nNAME: Dogecoin
- "BTC price analysis" β†’ SYMBOL: BTC\nNAME: Bitcoin
- "What's happening with crypto markets?" β†’ SYMBOL: BTC\nNAME: Bitcoin

Respond with exactly two lines in the format above - no other text.`;

        const result = await model.generateContent(extractionPrompt);
        const responseText = result.response.text().trim();

        console.log(`Gemini extraction response: ${responseText}`);

        // Parse the structured response
        const lines = responseText.split('\n');
        let symbol = 'BTC';
        let fullName = 'Bitcoin';

        for (const line of lines) {
            if (line.startsWith('SYMBOL:')) {
                symbol = line.replace('SYMBOL:', '').trim().toUpperCase();
            } else if (line.startsWith('NAME:')) {
                fullName = line.replace('NAME:', '').trim();
            }
        }

        // Validate we got a reasonable symbol (2-10 uppercase letters)
        if (!/^[A-Z]{2,10}$/.test(symbol)) {
            console.warn(`Invalid symbol "${symbol}", defaulting to BTC`);
            symbol = 'BTC';
            fullName = 'Bitcoin';
        }

        console.log(`Extracted: ${symbol} (${fullName})`);
        return { symbol, fullName };
    } catch (error) {
        console.error('Error extracting crypto from query:', error);
        return {
            symbol: 'BTC',
            fullName: 'Bitcoin',
        }; // Safe fallback
    }
}
EOF
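
The route above streams newline-delimited JSON: several "progress" events, then a single "complete" (or "error") event. That means the browser has to read the response body incrementally instead of awaiting one JSON payload. Here is a sketch of how a client could consume the stream; the function name and callback shape are illustrative, not part of the tutorial's file set:

// Illustrative client-side consumer for the /api/analyze stream
async function runAnalysis(query: string, onEvent: (event: any) => void) {
    const res = await fetch('/api/analyze', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ query }),
    });

    if (!res.body) throw new Error('No response body');

    const reader = res.body.getReader();
    const decoder = new TextDecoder();
    let buffer = '';

    while (true) {
        const { done, value } = await reader.read();
        if (done) break;

        buffer += decoder.decode(value, { stream: true });

        // Each event is one JSON object terminated by a newline
        let newlineIndex = buffer.indexOf('\n');
        while (newlineIndex !== -1) {
            const line = buffer.slice(0, newlineIndex).trim();
            buffer = buffer.slice(newlineIndex + 1);
            if (line) onEvent(JSON.parse(line));
            newlineIndex = buffer.indexOf('\n');
        }
    }
}

// Usage: runAnalysis("What's the sentiment on Bitcoin?", (e) => console.log(e.type, e.message));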

Voice Recognition & Output Hooks (8 minutes)

Create Voice Recognition Hook

# Create voice recognition custom hook
cat > src/hooks/useVoiceRecognition.ts << 'EOF'
'use client';

import { useState, useEffect, useCallback } from 'react';
import SpeechRecognition, { useSpeechRecognition } from 'react-speech-recognition';

interface UseVoiceRecognitionReturn {
  transcript: string;
  isListening: boolean;
  isMicrophoneAvailable: boolean;
  startListening: () => void;
  stopListening: () => void;
  resetTranscript: () => void;
  error: string | null;
}

export function useVoiceRecognition(): UseVoiceRecognitionReturn {
  const [error, setError] = useState<string | null>(null);

  const {
    transcript,
    listening,
    resetTranscript,
    browserSupportsSpeechRecognition,
    isMicrophoneAvailable
  } = useSpeechRecognition();

  useEffect(() => {
    if (!browserSupportsSpeechRecognition) {
      setError('Browser does not support speech recognition. Please use Chrome, Safari, or Edge.');
    } else if (!isMicrophoneAvailable) {
      setError('Microphone is not available. Please check your permissions.');
    } else {
      setError(null);
    }
  }, [browserSupportsSpeechRecognition, isMicrophoneAvailable]);

  const startListening = useCallback(() => {
    setError(null);

    if (!browserSupportsSpeechRecognition) {
      setError('Speech recognition is not supported in this browser');
      return;
    }

    SpeechRecognition.startListening({
      continuous: true, // Keep listening for continuous speech
      language: 'en-US',
      interimResults: true, // Show interim results
    });
  }, [browserSupportsSpeechRecognition]);

  const stopListening = useCallback(() => {
    SpeechRecognition.stopListening();
  }, []);

  return {
    transcript,
    isListening: listening,
    isMicrophoneAvailable: isMicrophoneAvailable && browserSupportsSpeechRecognition,
    startListening,
    stopListening,
    resetTranscript,
    error
  };
}
EOF
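
Before building the full UI, here is an illustrative sketch of how a component could consume this hook; the component name and markup are assumptions, not part of the tutorial's file set:

'use client';

import { useVoiceRecognition } from '@/hooks/useVoiceRecognition';

// Illustrative consumer: a single button that toggles listening and hands the transcript to a callback
export function MicButton({ onQuery }: { onQuery: (text: string) => void }) {
    const {
        transcript,
        isListening,
        startListening,
        stopListening,
        resetTranscript,
        error,
    } = useVoiceRecognition();

    const handleStop = () => {
        stopListening();
        if (transcript) onQuery(transcript); // Pass the captured speech to the analysis flow
        resetTranscript();
    };

    return (
        <div>
            <button onClick={isListening ? handleStop : startListening}>
                {isListening ? 'Stop' : 'Speak'}
            </button>
            <p>{transcript || 'Try: "What is the sentiment on Bitcoin?"'}</p>
            {error && <p role='alert'>{error}</p>}
        </div>
    );
}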

Create Voice Output Hook

# Create voice output custom hook with advanced controls
cat > src/hooks/useVoiceOutput.ts << 'EOF'
'use client';

import { useState, useRef, useCallback, useEffect } from 'react';

interface UseVoiceOutputReturn {
  isSpeaking: boolean;
  isPaused: boolean;
  speak: (text: string) => Promise<void>;
  pause: () => void;
  resume: () => void;
  stop: () => void;
  setRate: (rate: number) => void;
  setVolume: (volume: number) => void;
  setVoice: (voice: SpeechSynthesisVoice | null) => void;
  currentRate: number;
  currentVolume: number;
  currentVoice: SpeechSynthesisVoice | null;
  availableVoices: SpeechSynthesisVoice[];
  error: string | null;
}

export function useVoiceOutput(): UseVoiceOutputReturn {
  const [isSpeaking, setIsSpeaking] = useState(false);
  const [isPaused, setIsPaused] = useState(false);
  const [currentRate, setCurrentRate] = useState(1);
  const [currentVolume, setCurrentVolume] = useState(1);
  const [currentVoice, setCurrentVoice] = useState<SpeechSynthesisVoice | null>(null);
  const [availableVoices, setAvailableVoices] = useState<SpeechSynthesisVoice[]>([]);
  const [error, setError] = useState<string | null>(null);

  const utteranceRef = useRef<SpeechSynthesisUtterance | null>(null);
  const isIntentionalStop = useRef(false);

  // Load available voices
  useEffect(() => {
    const loadVoices = () => {
      const voices = speechSynthesis.getVoices();
      setAvailableVoices(voices);

      // Auto-select a good default voice if none selected
      if (!currentVoice && voices.length > 0) {
        // Try to find a good English voice
        const preferredVoice = voices.find(voice => 
          voice.lang.startsWith('en') && (
            voice.name.includes('Google') || 
            voice.name.includes('Microsoft') ||
            voice.name.includes('Samantha') || // macOS
            voice.name.includes('Alex') || // macOS
            voice.name.includes('Natural') ||
            voice.name.includes('Neural')
          )
        ) || voices.find(voice => voice.lang.startsWith('en')) || voices[0];

        setCurrentVoice(preferredVoice);
      }
    };

    // Load voices immediately
    loadVoices();

    // Some browsers load voices asynchronously
    speechSynthesis.addEventListener('voiceschanged', loadVoices);

    return () => {
      speechSynthesis.removeEventListener('voiceschanged', loadVoices);
    };
  }, [currentVoice]);

  const speak = useCallback(async (text: string) => {
    return new Promise<void>((resolve, reject) => {
      try {
        // Stop any current speech
        if (utteranceRef.current) {
          isIntentionalStop.current = true;
          speechSynthesis.cancel();
        }

        setError(null);
        setIsSpeaking(true);
        setIsPaused(false);
        isIntentionalStop.current = false;

        const utterance = new SpeechSynthesisUtterance(text);
        utteranceRef.current = utterance;

        // Set voice properties
        utterance.rate = currentRate;
        utterance.volume = currentVolume;
        utterance.pitch = 1;

        // Use selected voice
        if (currentVoice) {
          utterance.voice = currentVoice;
        }

        utterance.onstart = () => {
          setIsSpeaking(true);
          setIsPaused(false);
        };

        utterance.onend = () => {
          setIsSpeaking(false);
          setIsPaused(false);
          utteranceRef.current = null;

          // Only resolve if it wasn't an intentional stop
          if (!isIntentionalStop.current) {
            resolve();
          }
        };

        utterance.onerror = (event) => {
          setIsSpeaking(false);
          setIsPaused(false);
          utteranceRef.current = null;

          // Only show error if it wasn't an intentional interruption
          if (!isIntentionalStop.current && event.error !== 'interrupted') {
            setError(`Speech synthesis error: ${event.error}`);
            reject(new Error(`Speech synthesis error: ${event.error}`));
          } else {
            // For intentional stops or interruptions, just resolve
            resolve();
          }
        };

        utterance.onpause = () => {
          setIsPaused(true);
        };

        utterance.onresume = () => {
          setIsPaused(false);
        };

        speechSynthesis.speak(utterance);

      } catch (err) {
        setError(err instanceof Error ? err.message : 'Failed to synthesize speech');
        setIsSpeaking(false);
        setIsPaused(false);
        reject(err);
      }
    });
  }, [currentRate, currentVolume, currentVoice]);

  const pause = useCallback(() => {
    if (utteranceRef.current && isSpeaking) {
      speechSynthesis.pause();
    }
  }, [isSpeaking]);

  const resume = useCallback(() => {
    if (utteranceRef.current && isPaused) {
      speechSynthesis.resume();
    }
  }, [isPaused]);

  const stop = useCallback(() => {
    if (utteranceRef.current) {
      isIntentionalStop.current = true;
      speechSynthesis.cancel();
      setIsSpeaking(false);
      setIsPaused(false);
      utteranceRef.current = null;
      setError(null); // Clear any errors when intentionally stopping
    }
  }, []);

  const setRate = useCallback((rate: number) => {
    setCurrentRate(rate);
    // If currently speaking, stop and restart with new rate
    if (utteranceRef.current && isSpeaking) {
      const currentText = utteranceRef.current.text;
      stop();
      // Small delay to ensure clean stop
      setTimeout(() => {
        speak(currentText);
      }, 100);
    }
  }, [isSpeaking, speak, stop]);

  const setVolume = useCallback((volume: number) => {
    setCurrentVolume(volume);
    // Volume can be changed in real-time
    if (utteranceRef.current) {
      utteranceRef.current.volume = volume;
    }
  }, []);

  const setVoice = useCallback((voice: SpeechSynthesisVoice | null) => {
    setCurrentVoice(voice);
    // If currently speaking, stop and restart with new voice
    if (utteranceRef.current && isSpeaking) {
      const currentText = utteranceRef.current.text;
      stop();
      // Small delay to ensure clean stop
      setTimeout(() => {
        speak(currentText);
      }, 100);
    }
  }, [isSpeaking, speak, stop]);

  return {
    isSpeaking,
    isPaused,
    speak,
    pause,
    resume,
    stop,
    setRate,
    setVolume,
    setVoice,
    currentRate,
    currentVolume,
    currentVoice,
    availableVoices,
    error
  };
}
EOF
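
And a matching sketch for the output hook, again with illustrative names that aren't part of the tutorial's file set:

'use client';

import { useVoiceOutput } from '@/hooks/useVoiceOutput';

// Illustrative playback controls for reading an analysis summary aloud
export function SpeakSummary({ summary }: { summary: string }) {
    const { speak, pause, resume, stop, setRate, isSpeaking, isPaused } = useVoiceOutput();

    return (
        <div>
            <button onClick={() => speak(summary)} disabled={isSpeaking}>
                Read analysis aloud
            </button>
            <button onClick={isPaused ? resume : pause} disabled={!isSpeaking}>
                {isPaused ? 'Resume' : 'Pause'}
            </button>
            <button onClick={stop} disabled={!isSpeaking}>
                Stop
            </button>
            {/* Slightly slower speech tends to work better for dense financial summaries */}
            <button onClick={() => setRate(0.9)}>Slower</button>
        </div>
    );
}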

UI Components (10 minutes)

Create Analysis Progress Component

# Create animated progress component
cat > src/components/AnalysisProgress.tsx << 'EOF'
'use client';

import { useState, useEffect } from 'react';
import {
    Box,
    Paper,
    Typography,
    LinearProgress,
    Stack,
    Chip,
    alpha,
} from '@mui/material';
import {
    Search,
    Psychology,
    Analytics,
    TrendingUp,
    CheckCircle,
} from '@mui/icons-material';

interface AnalysisProgressProps {
    isAnalyzing: boolean;
    progressMessage?: string;
    onComplete?: () => void;
}

export function AnalysisProgress({
    isAnalyzing,
    progressMessage,
    onComplete,
}: AnalysisProgressProps) {
    const [currentStep, setCurrentStep] = useState(0);
    const [progress, setProgress] = useState(0);
    const [subMessage, setSubMessage] = useState('');

    const steps = [
        {
            icon: <Search />,
            title: 'Detecting Cryptocurrency',
            description: 'Identifying the cryptocurrency from your query...',
            duration: 2000,
            subMessages: [
                'Parsing natural language query...',
                'Matching cryptocurrency symbols...',
                'Validating cryptocurrency detection...',
            ],
        },
        {
            icon: <Analytics />,
            title: 'Connecting to LunarCrush MCP',
            description: 'Fetching real-time social sentiment data...',
            duration: 8000,
            subMessages: [
                'Connecting to LunarCrush MCP server...',
                'Requesting social metrics...',
                'Fetching engagement data...',
                'Analyzing social dominance...',
                'Processing creator activity...',
                'Gathering market data...',
                'Collecting mention statistics...',
            ],
        },
        {
            icon: <Psychology />,
            title: 'AI Analysis with Google Gemini',
            description: 'Processing data with advanced AI...',
            duration: 25000,
            subMessages: [
                'Sending data to Google Gemini...',
                'Analyzing market sentiment patterns...',
                'Processing social engagement metrics...',
                'Evaluating price action trends...',
                'Cross-referencing technical indicators...',
                'Assessing institutional activity...',
                'Calculating confidence levels...',
                'Generating investment recommendations...',
                'Synthesizing comprehensive analysis...',
                'Preparing natural language response...',
            ],
        },
        {
            icon: <TrendingUp />,
            title: 'Finalizing Results',
            description: 'Preparing your comprehensive analysis...',
            duration: 3000,
            subMessages: [
                'Formatting analysis results...',
                'Preparing voice synthesis...',
                'Validating data consistency...',
            ],
        },
    ];

    useEffect(() => {
        if (!isAnalyzing) {
            setCurrentStep(0);
            setProgress(0);
            setSubMessage('');
            return;
        }

        let stepTimer: NodeJS.Timeout;
        let progressTimer: NodeJS.Timeout;
        let subMessageTimer: NodeJS.Timeout;

        const runStep = (stepIndex: number) => {
            if (stepIndex >= steps.length) {
                setProgress(100);
                if (onComplete) onComplete();
                return;
            }

            const step = steps[stepIndex];
            setCurrentStep(stepIndex);

            // Animate progress for this step
            const stepProgress = (stepIndex / steps.length) * 100;
            const nextStepProgress = ((stepIndex + 1) / steps.length) * 100;

            let currentProgress = stepProgress;
            progressTimer = setInterval(() => {
                currentProgress +=
                    (nextStepProgress - stepProgress) / (step.duration / 100);
                if (currentProgress >= nextStepProgress) {
                    currentProgress = nextStepProgress;
                    clearInterval(progressTimer);
                }
                setProgress(currentProgress);
            }, 100);

            // Cycle through sub-messages
            let subMessageIndex = 0;
            const showSubMessage = () => {
                if (subMessageIndex < step.subMessages.length) {
                    setSubMessage(step.subMessages[subMessageIndex]);
                    subMessageIndex++;
                    subMessageTimer = setTimeout(
                        showSubMessage,
                        step.duration / step.subMessages.length
                    );
                }
            };
            showSubMessage();

            // Move to next step
            stepTimer = setTimeout(() => {
                clearInterval(progressTimer);
                clearTimeout(subMessageTimer);
                runStep(stepIndex + 1);
            }, step.duration);
        };

        runStep(0);

        return () => {
            clearTimeout(stepTimer);
            clearInterval(progressTimer);
            clearTimeout(subMessageTimer);
        };
    }, [isAnalyzing]);

    if (!isAnalyzing) return null;

    return (
        <Paper
            elevation={2}
            sx={{
                p: 4,
                borderRadius: 3,
                background: 'linear-gradient(135deg, #1A1A1A 0%, #2A2A2A 100%)',
                border: '1px solid #333',
            }}>
            <Box sx={{ mb: 3 }}>
                <Typography
                    variant='h6'
                    sx={{
                        mb: 2,
                        color: 'white',
                        display: 'flex',
                        alignItems: 'center',
                        gap: 1,
                    }}>
                    <Psychology sx={{ color: '#00C896' }} />
                    Analyzing Cryptocurrency Data
                </Typography>

                <LinearProgress
                    variant='determinate'
                    value={progress}
                    sx={{
                        height: 8,
                        borderRadius: 4,
                        backgroundColor: alpha('#00C896', 0.2),
                        '& .MuiLinearProgress-bar': {
                            backgroundColor: '#00C896',
                            borderRadius: 4,
                        },
                    }}
                />

                <Box sx={{ display: 'flex', justifyContent: 'space-between', mt: 1 }}>
                    <Typography variant='caption' sx={{ color: 'rgba(255,255,255,0.7)' }}>
                        {Math.round(progress)}% Complete
                    </Typography>
                    <Typography variant='caption' sx={{ color: 'rgba(255,255,255,0.7)' }}>
                        ~{Math.max(0, Math.round((100 - progress) * 0.4))}s remaining
                    </Typography>
                </Box>
            </Box>

            <Stack spacing={2}>
                {steps.map((step, index) => (
                    <Box
                        key={index}
                        sx={{
                            display: 'flex',
                            alignItems: 'center',
                            gap: 2,
                            p: 2,
                            borderRadius: 2,
                            backgroundColor:
                                index === currentStep
                                    ? alpha('#00C896', 0.1)
                                    : index < currentStep
                                    ? alpha('#00C896', 0.05)
                                    : 'transparent',
                            border:
                                index === currentStep
                                    ? '1px solid #00C896'
                                    : '1px solid transparent',
                            transition: 'all 0.3s ease',
                        }}>
                        <Box
                            sx={{
                                color:
                                    index === currentStep
                                        ? '#00C896'
                                        : index < currentStep
                                        ? '#00C896'
                                        : 'rgba(255,255,255,0.5)',
                                transition: 'color 0.3s ease',
                            }}>
                            {index < currentStep ? <CheckCircle /> : step.icon}
                        </Box>

                        <Box sx={{ flex: 1 }}>
                            <Typography
                                variant='body1'
                                sx={{
                                    color:
                                        index <= currentStep ? 'white' : 'rgba(255,255,255,0.5)',
                                    fontWeight: index === currentStep ? 600 : 400,
                                }}>
                                {step.title}
                            </Typography>

                            {index === currentStep && (
                                <Typography
                                    variant='body2'
                                    sx={{
                                        color: 'rgba(255,255,255,0.8)',
                                        fontStyle: 'italic',
                                        mt: 0.5,
                                    }}>
                                    {progressMessage || subMessage || step.description}
                                </Typography>
                            )}
                        </Box>

                        {index === currentStep && (
                            <Chip
                                label='Processing'
                                size='small'
                                sx={{
                                    backgroundColor: alpha('#00C896', 0.2),
                                    color: '#00C896',
                                    animation: 'pulse 2s infinite',
                                }}
                            />
                        )}

                        {index < currentStep && (
                            <Chip
                                label='Complete'
                                size='small'
                                sx={{
                                    backgroundColor: alpha('#00C896', 0.2),
                                    color: '#00C896',
                                }}
                            />
                        )}
                    </Box>
                ))}
            </Stack>
        </Paper>
    );
}
EOF
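The component drives its own step animation from the `isAnalyzing` flag and returns `null` when idle, so mounting it is a single element. A sketch of typical usage (the `isProcessing` / `progressMessage` state lives in the VoiceAssistant component built later in this tutorial):

```
// isAnalyzing starts/stops the simulated pipeline; progressMessage (from the
// streaming API) overrides the canned sub-messages while a step is active.
<AnalysisProgress isAnalyzing={isProcessing} progressMessage={progressMessage} />
```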

Create Analysis Results Component

# Create comprehensive results display component
cat > src/components/AnalysisResults.tsx << 'EOF'
'use client';

import {
    Box,
    Paper,
    Typography,
    Chip,
    Grid,
    Card,
    Stack,
    alpha,
} from '@mui/material';
import {
    Psychology,
    TrendingUp,
    TrendingDown,
    Remove,
    AttachMoney,
    ShowChart,
    People,
    Visibility,
    Speed,
    AccountBalanceWallet,
} from '@mui/icons-material';
import {
    formatCurrency,
    formatPercentage,
    formatCount,
} from '@/lib/formatters';

interface AnalysisResultsProps {
    data: {
        success: boolean;
        recommendation: 'BUY' | 'SELL' | 'HOLD';
        confidence: number;
        reasoning: string;
        social_sentiment: 'bullish' | 'bearish' | 'neutral';
        key_metrics: {
            price: string;
            galaxy_score: string;
            alt_rank: string;
            social_dominance: string;
            market_cap: string;
            volume_24h: string;
            mentions: string;
            engagements: string;
            creators: string;
        };
        ai_analysis:
            | {
                    summary: string;
                    pros: string[];
                    cons: string[];
                    key_factors: string[];
              }
            | string;
        miscellaneous: string;
        spokenResponse: string;
        symbol: string;
        toolsUsed: number;
        dataPoints: number;
        responseTime: number;
        crypto_detection: {
            detected_crypto: string;
            symbol: string;
            confidence: number;
            reasoning: string;
        };
    };
}

export function AnalysisResults({ data }: AnalysisResultsProps) {
    const getRecommendationColor = (recommendation: string) => {
        switch (recommendation) {
            case 'BUY':
                return { bg: '#00C896', text: 'white' };
            case 'SELL':
                return { bg: '#FF6B6B', text: 'white' };
            case 'HOLD':
                return { bg: '#FFB800', text: 'white' };
            default:
                return { bg: '#666', text: 'white' };
        }
    };

    const getSentimentIcon = (sentiment: string) => {
        switch (sentiment) {
            case 'bullish':
                return <TrendingUp sx={{ color: '#00C896' }} />;
            case 'bearish':
                return <TrendingDown sx={{ color: '#FF6B6B' }} />;
            default:
                return <Remove sx={{ color: '#B3B3B3' }} />;
        }
    };

    const getSentimentColor = (sentiment: string) => {
        switch (sentiment) {
            case 'bullish':
                return '#00C896';
            case 'bearish':
                return '#FF6B6B';
            default:
                return '#B3B3B3';
        }
    };

    const recColor = getRecommendationColor(data.recommendation);

    // Format all the metrics using our utility functions for better readability
    const formattedMetrics = {
        price: `$${data.key_metrics.price}`,
        marketCap: formatCurrency(data.key_metrics.market_cap),
        volume24h: formatCurrency(data.key_metrics.volume_24h),
        galaxyScore: data.key_metrics.galaxy_score,
        socialDominance: formatPercentage(data.key_metrics.social_dominance),
        mentions: formatCount(data.key_metrics.mentions),
        engagements: formatCount(data.key_metrics.engagements),
        creators: formatCount(data.key_metrics.creators),
        altRank: data.key_metrics.alt_rank,
    };

    return (
        <Box sx={{ mt: 4 }}>
            {/* Header Section */}
            <Paper elevation={1} sx={{ p: 4, borderRadius: 3, mb: 3 }}>
                <Box sx={{ display: 'flex', alignItems: 'center', gap: 2, mb: 3 }}>
                    <Psychology sx={{ color: 'primary.main', fontSize: 32 }} />
                    <Box>
                        <Typography variant='h4' sx={{ fontWeight: 700 }}>
                            {data.symbol}
                        </Typography>
                    </Box>
                </Box>

                {/* Key Decision Metrics */}
                <Grid container spacing={3} sx={{ mt: 2 }}>
                    <Grid xs={12} md={4}>
                        <Card
                            elevation={0}
                            sx={{
                                bgcolor: alpha(recColor.bg, 0.1),
                                border: `2px solid ${recColor.bg}`,
                                p: 3,
                                textAlign: 'center',
                                height: 140,
                                display: 'flex',
                                flexDirection: 'column',
                                justifyContent: 'center',
                                borderRadius: 2,
                            }}>
                            <Typography
                                variant='subtitle1'
                                color='text.secondary'
                                gutterBottom>
                                Recommendation
                            </Typography>
                            <Chip
                                label={data.recommendation}
                                sx={{
                                    bgcolor: recColor.bg,
                                    color: recColor.text,
                                    fontSize: '1.1rem',
                                    fontWeight: 700,
                                    px: 2,
                                    py: 1,
                                    mx: 'auto',
                                }}
                            />
                        </Card>
                    </Grid>

                    <Grid xs={12} md={4}>
                        <Card
                            elevation={0}
                            sx={{
                                bgcolor: alpha('#4285F4', 0.1),
                                border: '2px solid #4285F4',
                                p: 3,
                                textAlign: 'center',
                                height: 140,
                                display: 'flex',
                                flexDirection: 'column',
                                justifyContent: 'center',
                                borderRadius: 2,
                            }}>
                            <Typography
                                variant='subtitle1'
                                color='text.secondary'
                                gutterBottom>
                                Confidence Level
                            </Typography>
                            <Typography
                                variant='h3'
                                sx={{ color: '#4285F4', fontWeight: 700 }}>
                                {data.confidence}%
                            </Typography>
                        </Card>
                    </Grid>

                    <Grid xs={12} md={4}>
                        <Card
                            elevation={0}
                            sx={{
                                bgcolor: alpha(getSentimentColor(data.social_sentiment), 0.1),
                                border: `2px solid ${getSentimentColor(data.social_sentiment)}`,
                                p: 3,
                                textAlign: 'center',
                                height: 140,
                                display: 'flex',
                                flexDirection: 'column',
                                justifyContent: 'center',
                                borderRadius: 2,
                            }}>
                            <Typography
                                variant='subtitle1'
                                color='text.secondary'
                                gutterBottom>
                                Social Sentiment
                            </Typography>
                            <Box
                                sx={{
                                    display: 'flex',
                                    alignItems: 'center',
                                    justifyContent: 'center',
                                    gap: 1,
                                }}>
                                {getSentimentIcon(data.social_sentiment)}
                                <Typography
                                    variant='h5'
                                    sx={{ textTransform: 'capitalize', fontWeight: 600 }}>
                                    {data.social_sentiment}
                                </Typography>
                            </Box>
                        </Card>
                    </Grid>
                </Grid>
            </Paper>

            {/* Key Metrics Section with Formatted Numbers */}
            <Paper elevation={1} sx={{ p: 4, borderRadius: 3, mb: 3 }}>
                <Typography
                    variant='h5'
                    gutterBottom
                    sx={{ display: 'flex', alignItems: 'center', gap: 1, mb: 3 }}>
                    <ShowChart sx={{ color: 'primary.main' }} />
                    Key Market & Social Metrics
                </Typography>

                <Grid container spacing={3}>
                    <Grid xs={12} sm={6} md={3}>
                        <Card elevation={0} sx={{ bgcolor: 'background.default', p: 2 }}>
                            <Stack
                                direction='row'
                                alignItems='center'
                                spacing={1}
                                sx={{ mb: 1 }}>
                                <AttachMoney sx={{ color: '#00C896' }} />
                                <Typography variant='body2' color='text.secondary'>
                                    Current Price
                                </Typography>
                            </Stack>
                            <Typography variant='h6' sx={{ fontWeight: 600 }}>
                                {formattedMetrics.price}
                            </Typography>
                        </Card>
                    </Grid>

                    <Grid xs={12} sm={6} md={3}>
                        <Card elevation={0} sx={{ bgcolor: 'background.default', p: 2 }}>
                            <Stack
                                direction='row'
                                alignItems='center'
                                spacing={1}
                                sx={{ mb: 1 }}>
                                <AccountBalanceWallet sx={{ color: '#4285F4' }} />
                                <Typography variant='body2' color='text.secondary'>
                                    Market Cap
                                </Typography>
                            </Stack>
                            <Typography variant='h6' sx={{ fontWeight: 600 }}>
                                {formattedMetrics.marketCap}
                            </Typography>
                        </Card>
                    </Grid>

                    <Grid xs={12} sm={6} md={3}>
                        <Card elevation={0} sx={{ bgcolor: 'background.default', p: 2 }}>
                            <Stack
                                direction='row'
                                alignItems='center'
                                spacing={1}
                                sx={{ mb: 1 }}>
                                <ShowChart sx={{ color: '#FFB800' }} />
                                <Typography variant='body2' color='text.secondary'>
                                    24h Volume
                                </Typography>
                            </Stack>
                            <Typography variant='h6' sx={{ fontWeight: 600 }}>
                                {formattedMetrics.volume24h}
                            </Typography>
                        </Card>
                    </Grid>

                    <Grid xs={12} sm={6} md={3}>
                        <Card elevation={0} sx={{ bgcolor: 'background.default', p: 2 }}>
                            <Stack
                                direction='row'
                                alignItems='center'
                                spacing={1}
                                sx={{ mb: 1 }}>
                                <Speed sx={{ color: '#FF6B6B' }} />
                                <Typography variant='body2' color='text.secondary'>
                                    Galaxy Score
                                </Typography>
                            </Stack>
                            <Typography variant='h6' sx={{ fontWeight: 600 }}>
                                {formattedMetrics.galaxyScore}
                            </Typography>
                        </Card>
                    </Grid>

                    <Grid xs={12} sm={6} md={3}>
                        <Card elevation={0} sx={{ bgcolor: 'background.default', p: 2 }}>
                            <Stack
                                direction='row'
                                alignItems='center'
                                spacing={1}
                                sx={{ mb: 1 }}>
                                <Visibility sx={{ color: '#9C27B0' }} />
                                <Typography variant='body2' color='text.secondary'>
                                    Social Dominance
                                </Typography>
                            </Stack>
                            <Typography variant='h6' sx={{ fontWeight: 600 }}>
                                {formattedMetrics.socialDominance}
                            </Typography>
                        </Card>
                    </Grid>

                    <Grid xs={12} sm={6} md={3}>
                        <Card elevation={0} sx={{ bgcolor: 'background.default', p: 2 }}>
                            <Stack
                                direction='row'
                                alignItems='center'
                                spacing={1}
                                sx={{ mb: 1 }}>
                                <People sx={{ color: '#00BCD4' }} />
                                <Typography variant='body2' color='text.secondary'>
                                    Mentions (24h)
                                </Typography>
                            </Stack>
                            <Typography variant='h6' sx={{ fontWeight: 600 }}>
                                {formattedMetrics.mentions}
                            </Typography>
                        </Card>
                    </Grid>

                    <Grid xs={12} sm={6} md={3}>
                        <Card elevation={0} sx={{ bgcolor: 'background.default', p: 2 }}>
                            <Stack
                                direction='row'
                                alignItems='center'
                                spacing={1}
                                sx={{ mb: 1 }}>
                                <TrendingUp sx={{ color: '#4CAF50' }} />
                                <Typography variant='body2' color='text.secondary'>
                                    Engagements
                                </Typography>
                            </Stack>
                            <Typography variant='h6' sx={{ fontWeight: 600 }}>
                                {formattedMetrics.engagements}
                            </Typography>
                        </Card>
                    </Grid>

                    <Grid xs={12} sm={6} md={3}>
                        <Card elevation={0} sx={{ bgcolor: 'background.default', p: 2 }}>
                            <Stack
                                direction='row'
                                alignItems='center'
                                spacing={1}
                                sx={{ mb: 1 }}>
                                <People sx={{ color: '#FF9800' }} />
                                <Typography variant='body2' color='text.secondary'>
                                    Active Creators
                                </Typography>
                            </Stack>
                            <Typography variant='h6' sx={{ fontWeight: 600 }}>
                                {formattedMetrics.creators}
                            </Typography>
                        </Card>
                    </Grid>
                </Grid>
            </Paper>

            {/* Analysis Sections */}
            <Grid container sx={{ justifyContent: 'space-between' }}>
                <Grid xs={12} md={7}>
                    <Paper
                        elevation={1}
                        sx={{ p: 4, borderRadius: 3, mb: 3, height: '100%' }}>
                        <Typography
                            variant='h6'
                            gutterBottom
                            sx={{ color: '#00C896', fontWeight: 600 }}>
                            🧠 AI Analysis Summary
                        </Typography>
                        {typeof data.ai_analysis === 'object' ? (
                            <Stack spacing={2}>
                                <Typography
                                    variant='body1'
                                    sx={{ lineHeight: 1.7, fontWeight: 500 }}>
                                    {data.ai_analysis.summary}
                                </Typography>

                                {data.ai_analysis.pros.length > 0 && (
                                    <Box>
                                        <Typography
                                            variant='subtitle2'
                                            sx={{ color: '#00C896', fontWeight: 600, mb: 1 }}>
                                            Positive Factors:
                                        </Typography>
                                        <Box component='ul' sx={{ pl: 2, m: 0 }}>
                                            {data.ai_analysis.pros.map((pro, index) => (
                                                <Typography
                                                    key={index}
                                                    component='li'
                                                    variant='body2'
                                                    sx={{ mb: 0.5 }}>
                                                    {pro}
                                                </Typography>
                                            ))}
                                        </Box>
                                    </Box>
                                )}

                                {data.ai_analysis.cons.length > 0 && (
                                    <Box>
                                        <Typography
                                            variant='subtitle2'
                                            sx={{ color: '#FF6B6B', fontWeight: 600, mb: 1 }}>
                                            Risk Factors:
                                        </Typography>
                                        <Box component='ul' sx={{ pl: 2, m: 0 }}>
                                            {data.ai_analysis.cons.map((con, index) => (
                                                <Typography
                                                    key={index}
                                                    component='li'
                                                    variant='body2'
                                                    sx={{ mb: 0.5 }}>
                                                    {con}
                                                </Typography>
                                            ))}
                                        </Box>
                                    </Box>
                                )}

                                {data.ai_analysis.key_factors.length > 0 && (
                                    <Box>
                                        <Typography
                                            variant='subtitle2'
                                            sx={{ color: '#4285F4', fontWeight: 600, mb: 1 }}>
                                            Key Factors to Monitor:
                                        </Typography>
                                        <Box component='ul' sx={{ pl: 2, m: 0 }}>
                                            {data.ai_analysis.key_factors.map((factor, index) => (
                                                <Typography
                                                    key={index}
                                                    component='li'
                                                    variant='body2'
                                                    sx={{ mb: 0.5 }}>
                                                    {factor}
                                                </Typography>
                                            ))}
                                        </Box>
                                    </Box>
                                )}
                            </Stack>
                        ) : (
                            <Typography variant='body1' sx={{ lineHeight: 1.7 }}>
                                {data.ai_analysis}
                            </Typography>
                        )}
                    </Paper>
                </Grid>

                <Grid xs={12} md={4.75}>
                    <Paper elevation={1} sx={{ p: 4, borderRadius: 3 }}>
                        <Typography
                            variant='h6'
                            gutterBottom
                            sx={{ color: '#4285F4', fontWeight: 600 }}>
                            📊 Detailed Reasoning
                        </Typography>
                        <Typography variant='body1' sx={{ lineHeight: 1.7 }}>
                            {data.reasoning}
                        </Typography>
                    </Paper>
                </Grid>
            </Grid>

            {/* Detection Info */}
        </Box>
    );
}
EOF
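`AnalysisResults` renders whatever object the analysis API returns, as long as it matches `AnalysisResultsProps['data']`. Here's a trimmed, illustrative payload (every value is made up) so you can see the expected shape; note that `price` is rendered with a `$` prefix, so the API should supply a bare number string:

```
import { AnalysisResults } from '@/components/AnalysisResults';

// Illustrative data conforming to AnalysisResultsProps['data'] — values are made up.
const sample = {
  success: true,
  recommendation: 'HOLD' as const,
  confidence: 72,
  reasoning: 'Social momentum is positive but 24h volume is thinning.',
  social_sentiment: 'bullish' as const,
  key_metrics: {
    price: '67250.12', // rendered as $67250.12
    galaxy_score: '74',
    alt_rank: '12',
    social_dominance: '21.4',
    market_cap: '1320000000000',
    volume_24h: '28000000000',
    mentions: '182000',
    engagements: '9400000',
    creators: '41000',
  },
  ai_analysis: 'Plain-string summaries are also supported.',
  miscellaneous: '',
  spokenResponse: 'Bitcoin looks like a hold today.',
  symbol: 'BTC',
  toolsUsed: 3,
  dataPoints: 42,
  responseTime: 31000,
  crypto_detection: {
    detected_crypto: 'Bitcoin',
    symbol: 'BTC',
    confidence: 95,
    reasoning: 'The query mentioned Bitcoin explicitly.',
  },
};

export const Example = () => <AnalysisResults data={sample} />;
```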

Main Application Setup (5 minutes)

Create Material-UI Theme and Layout

# Create root layout with the MUI theme provider
cat > src/app/layout.tsx << 'EOF'
import 'regenerator-runtime/runtime';
import type { Metadata } from 'next';
import { ThemeProvider } from '@mui/material/styles';
import CssBaseline from '@mui/material/CssBaseline';
import { AppRouterCacheProvider } from '@mui/material-nextjs/v15-appRouter';
import theme from '@/lib/theme';
import './globals.css';

export const metadata: Metadata = {
    title: 'Voice Crypto Assistant',
    description: 'AI-powered cryptocurrency analysis with voice interface',
};

export default function RootLayout({
    children,
}: {
    children: React.ReactNode;
}) {
    return (
        <html lang='en'>
            <body>
                <AppRouterCacheProvider>
                    <ThemeProvider theme={theme}>
                        <CssBaseline />
                        {children}
                    </ThemeProvider>
                </AppRouterCacheProvider>
            </body>
        </html>
    );
}
EOF
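The layout imports `theme` from `@/lib/theme`. If you haven't created that file in an earlier step, a minimal dark theme in the same palette as the components above could look like this (the exact values are assumptions, tweak to taste):

```
// src/lib/theme.ts — minimal dark theme sketch; palette values are assumptions
'use client';

import { createTheme } from '@mui/material/styles';

const theme = createTheme({
  palette: {
    mode: 'dark',
    primary: { main: '#00C896' }, // accent green used across the UI
    secondary: { main: '#4285F4' },
    error: { main: '#FF6B6B' },
    background: { default: '#0A0A0A', paper: '#1A1A1A' },
  },
  shape: { borderRadius: 12 },
});

export default theme;
```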

Create Hero Section Component

# Create dynamic hero section
cat > src/components/HeroSection.tsx << 'EOF'
'use client';

import { useState, useEffect } from 'react';
import {
    Box,
    Container,
    Typography,
    Button,
    Stack,
    Chip,
    alpha,
} from '@mui/material';
import {
    TrendingUp,
    Psychology,
    Mic,
    AutoAwesome,
    Analytics,
} from '@mui/icons-material';

interface HeroSectionProps {
    onStartVoiceInput?: () => void;
}

export function HeroSection({ onStartVoiceInput }: HeroSectionProps) {
    const [currentDemo, setCurrentDemo] = useState(0);
    const [demoQueries, setDemoQueries] = useState([
        "What's the crypto market sentiment?",
        'How are cryptocurrencies trending?',
        "Analyze today's crypto performance",
        'Tell me about market opportunities',
        'Show me social sentiment data',
    ]);

    // Load dynamic demo queries on component mount
    useEffect(() => {
        const loadDynamicQueries = async () => {
            try {
                const dynamicQueries = [
                    "What's the sentiment on the top crypto?",
                    'How are the markets trending today?',
                    "What's the most talked about coin?",
                    'Show me social sentiment analysis',
                    'Which cryptocurrencies are gaining momentum?',
                ];
                setDemoQueries(dynamicQueries);
            } catch (error) {
                console.warn('Failed to load dynamic queries, using fallbacks:', error);
                // Keep the fallback queries already set in state
            }
        };

        loadDynamicQueries();
    }, []);

    // Cycle through demo queries
    useEffect(() => {
        const interval = setInterval(() => {
            setCurrentDemo((prev) => (prev + 1) % demoQueries.length);
        }, 3000);
        return () => clearInterval(interval);
    }, [demoQueries.length]);

    const handleStartVoice = () => {
        // First scroll to the voice assistant section
        const voiceAssistant = document.getElementById('voice-assistant');
        if (voiceAssistant) {
            voiceAssistant.scrollIntoView({ behavior: 'smooth' });

            // Then trigger voice input after a short delay for smooth UX
            setTimeout(() => {
                if (onStartVoiceInput) {
                    onStartVoiceInput();
                }
            }, 500);
        }
    };

    return (
        <Box
            sx={{
                background:
                    'linear-gradient(135deg, #0A0A0A 0%, #1A1A1A 50%, #2A2A2A 100%)',
                color: 'white',
                py: { xs: 8, md: 12 },
                position: 'relative',
                overflow: 'hidden',
            }}>
            {/* Animated background elements */}
            <Box
                sx={{
                    position: 'absolute',
                    top: 0,
                    left: 0,
                    right: 0,
                    bottom: 0,
                    opacity: 0.1,
                    background: `
            radial-gradient(circle at 20% 20%, #00C896 0%, transparent 50%),
            radial-gradient(circle at 80% 80%, #FFD700 0%, transparent 50%),
            radial-gradient(circle at 40% 80%, #FF6B6B 0%, transparent 50%)
          `,
                }}
            />

            <Container maxWidth='lg' sx={{ position: 'relative', zIndex: 1 }}>
                <Box sx={{ textAlign: 'center' }}>
                    {/* Main Branding Header */}
                    <Typography
                        variant='h2'
                        component='h1'
                        sx={{
                            fontWeight: 800,
                            fontSize: { xs: '2.5rem', md: '4rem' },
                            mb: 2,
                            background:
                                'linear-gradient(135deg, #00C896 0%, #FFD700 50%, #FF6B6B 100%)',
                            backgroundClip: 'text',
                            WebkitBackgroundClip: 'text',
                            WebkitTextFillColor: 'transparent',
                            textAlign: 'center',
                        }}>
                        Crypto AI Voice Agent
                    </Typography>

                    {/* Powered by section */}
                    <Stack
                        direction={{ xs: 'column', sm: 'row' }}
                        spacing={1}
                        justifyContent='center'
                        alignItems='center'
                        sx={{ mb: 4 }}>
                        <Typography variant='h6' sx={{ color: 'rgba(255,255,255,0.8)' }}>
                            Powered by
                        </Typography>
                        <Stack
                            direction='row'
                            spacing={1}
                            flexWrap='wrap'
                            justifyContent='center'>
                            <Chip
                                icon={<Analytics />}
                                label='LunarCrush MCP'
                                sx={{
                                    bgcolor: alpha('#00C896', 0.2),
                                    color: '#00C896',
                                    borderColor: '#00C896',
                                    border: '1px solid',
                                    fontWeight: 600,
                                }}
                            />
                            <Chip
                                icon={<Psychology />}
                                label='Google Gemini'
                                sx={{
                                    bgcolor: alpha('#4285F4', 0.2),
                                    color: '#4285F4',
                                    borderColor: '#4285F4',
                                    border: '1px solid',
                                    fontWeight: 600,
                                }}
                            />
                        </Stack>
                    </Stack>

                    {/* Subtitle */}
                    <Typography
                        variant='h5'
                        sx={{
                            color: 'rgba(255,255,255,0.9)',
                            mb: 4,
                            maxWidth: '600px',
                            mx: 'auto',
                            lineHeight: 1.4,
                            fontWeight: 400,
                        }}>
                        Real-time cryptocurrency analysis through natural voice conversation
                    </Typography>

                    {/* Demo query display */}
                    <Box
                        sx={{
                            mb: 6,
                            p: 3,
                            borderRadius: 2,
                            background: 'rgba(255,255,255,0.05)',
                            border: '1px solid rgba(255,255,255,0.1)',
                            backdropFilter: 'blur(10px)',
                        }}>
                        <Typography
                            variant='body2'
                            sx={{
                                color: 'rgba(255,255,255,0.6)',
                                mb: 1,
                                textTransform: 'uppercase',
                                letterSpacing: 1,
                            }}>
                            Try saying:
                        </Typography>
                        <Typography
                            variant='h6'
                            sx={{
                                color: '#00C896',
                                fontStyle: 'italic',
                                transition: 'all 0.3s ease',
                                minHeight: '2rem',
                            }}>
                            "{demoQueries[currentDemo]}"
                        </Typography>
                    </Box>

                    {/* CTA Button - Now directly starts voice input */}
                    <Button
                        variant='contained'
                        size='large'
                        onClick={handleStartVoice}
                        startIcon={<Mic />}
                        sx={{
                            bgcolor: '#00C896',
                            color: 'white',
                            px: 4,
                            py: 2,
                            fontSize: '1.1rem',
                            fontWeight: 600,
                            borderRadius: 3,
                            boxShadow: '0 8px 32px rgba(0, 200, 150, 0.3)',
                            '&:hover': {
                                bgcolor: '#00B085',
                                transform: 'translateY(-2px)',
                                boxShadow: '0 12px 40px rgba(0, 200, 150, 0.4)',
                            },
                            transition: 'all 0.3s ease',
                        }}>
                        Start Voice Analysis
                    </Button>

                    {/* Feature highlights */}
                    <Stack
                        direction={{ xs: 'column', md: 'row' }}
                        spacing={4}
                        justifyContent='center'
                        sx={{ mt: 8 }}>
                        {[
                            { icon: <TrendingUp />, text: 'Real-time Social Sentiment' },
                            { icon: <Psychology />, text: 'AI-Powered Analysis' },
                            { icon: <AutoAwesome />, text: 'Natural Voice Interface' },
                        ].map((feature, index) => (
                            <Stack
                                key={index}
                                direction='row'
                                spacing={1}
                                alignItems='center'>
                                <Box sx={{ color: '#00C896' }}>{feature.icon}</Box>
                                <Typography
                                    variant='body2'
                                    sx={{ color: 'rgba(255,255,255,0.8)' }}>
                                    {feature.text}
                                </Typography>
                            </Stack>
                        ))}
                    </Stack>
                </Box>
            </Container>
        </Box>
    );
}
EOF
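The hero's `onStartVoiceInput` callback is meant to trigger the microphone in the VoiceAssistant component created next, which exposes a `startVoiceInput()` method through its forwarded ref. A hedged sketch of that wiring in `src/app/page.tsx` (your actual page may differ):

```
'use client';

import { useRef } from 'react';
import { HeroSection } from '@/components/HeroSection';
import { VoiceAssistant } from '@/components/VoiceAssistant';

export default function Home() {
  // VoiceAssistant exposes startVoiceInput() via useImperativeHandle.
  const assistantRef = useRef<{ startVoiceInput: () => void }>(null);

  return (
    <main>
      <HeroSection onStartVoiceInput={() => assistantRef.current?.startVoiceInput()} />
      <VoiceAssistant ref={assistantRef} />
    </main>
  );
}
```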

Create Main Voice Assistant Component

# Create the main voice assistant component (simplified for tutorial)
cat > src/components/VoiceAssistant.tsx << 'EOF'
'use client';

import {
    useState,
    useEffect,
    useRef,
    useImperativeHandle,
    forwardRef,
} from 'react';
import {
    Box,
    Card,
    Typography,
    Button,
    TextField,
    Paper,
    Stack,
    IconButton,
    Slider,
    Collapse,
    Chip,
    Grow,
    alpha,
    Switch,
    FormControlLabel,
    Select,
    MenuItem,
    FormControl,
} from '@mui/material';
import {
    Mic,
    MicOff,
    Psychology,
    VolumeUp,
    VolumeOff,
    PlayArrow,
    Pause,
    Stop,
    Settings,
    Edit,
    Send,
    Cancel,
    ExpandMore,
    ExpandLess,
    RecordVoiceOver,
    VoiceChat,
} from '@mui/icons-material';

import { useVoiceRecognition } from '@/hooks/useVoiceRecognition';
import { useVoiceOutput } from '@/hooks/useVoiceOutput';
import { AnalysisProgress } from './AnalysisProgress';
import { AnalysisResults } from './AnalysisResults';

interface VoiceAssistantRef {
    startVoiceInput: () => void;
}

export const VoiceAssistant = forwardRef<VoiceAssistantRef>((_, ref) => {
    // State management
    const [mounted, setMounted] = useState(false);
    const [isProcessing, setIsProcessing] = useState(false);
    const [progressMessage, setProgressMessage] = useState<string>('');
    const [lastResponse, setLastResponse] = useState<string>('');
    const [analysisData, setAnalysisData] = useState<any>(null);
    const [queryHistory, setQueryHistory] = useState<string[]>([]);
    const [abortController, setAbortController] =
        useState<AbortController | null>(null);
    const [showAudioControls, setShowAudioControls] = useState(false);
    const [autoSpeak, setAutoSpeak] = useState(true);

    // Edit functionality state
    const [isEditing, setIsEditing] = useState(false);
    const [editedQuery, setEditedQuery] = useState('');

    // Voice hooks
    const {
        transcript,
        isListening,
        isMicrophoneAvailable,
        startListening,
        stopListening,
        resetTranscript,
        error: voiceError,
    } = useVoiceRecognition();

    const {
        isSpeaking,
        speak,
        pause,
        resume,
        stop: stopSpeaking,
        isPaused,
        setRate,
        setVolume,
        setVoice,
        currentRate,
        currentVolume,
        currentVoice,
        availableVoices,
        error: speechError,
    } = useVoiceOutput();

    // Refs
    const lastTranscriptRef = useRef<string>('');
    const silenceTimer = useRef<NodeJS.Timeout | null>(null);

    // Fix hydration error
    useEffect(() => {
        setMounted(true);
    }, []);

    // Expose voice input method to parent components via useImperativeHandle
    useImperativeHandle(ref, () => ({
        startVoiceInput: handleVoiceInput,
    }));

    // Auto-submit after 4 seconds of silence - ONLY if not editing
    useEffect(() => {
        if (transcript && transcript !== lastTranscriptRef.current && !isEditing) {
            lastTranscriptRef.current = transcript;

            // Clear existing timer
            if (silenceTimer.current) {
                clearTimeout(silenceTimer.current);
            }

            // IMMEDIATELY set edited query when transcript appears
            setEditedQuery(transcript);

            // Set new timer for auto-submit (only if not editing)
            const timer = setTimeout(() => {
                if (transcript.trim() && isListening && !isEditing) {
                    console.log('Auto-submitting after silence:', transcript);
                    stopListening();
                    processQuery(transcript);
                }
            }, 4000);

            silenceTimer.current = timer;
        }

        return () => {
            if (silenceTimer.current) {
                clearTimeout(silenceTimer.current);
            }
        };
    }, [transcript, isListening, isEditing]);

    // Keyboard shortcuts
    useEffect(() => {
        if (!mounted) return;

        const handleKeyDown = (event: KeyboardEvent) => {
            // Space bar to toggle voice input (when not editing)
            if (
                event.code === 'Space' &&
                !isEditing &&
                event.target === document.body
            ) {
                event.preventDefault();
                handleVoiceInput();
            }

            // Escape to cancel/stop everything
            if (event.code === 'Escape') {
                if (isListening) {
                    stopListening();
                    resetTranscript();
                    setEditedQuery('');
                } else if (isSpeaking) {
                    stopSpeaking();
                } else if (isProcessing) {
                    handleStopQuery();
                } else if (isEditing) {
                    handleCancelEdit();
                }
            }

            // Enter to submit (when editing)
            if (event.code === 'Enter' && isEditing) {
                event.preventDefault();
                handleSubmitEditedQuery();
            }
        };

        document.addEventListener('keydown', handleKeyDown);
        return () => document.removeEventListener('keydown', handleKeyDown);
    }, [mounted, isListening, isSpeaking, isEditing, isProcessing]);

    // Load settings from localStorage
    useEffect(() => {
        if (!mounted) return;

        try {
            const saved = localStorage.getItem('voiceAssistantHistory');
            if (saved) {
                setQueryHistory(JSON.parse(saved));
            }

            const savedAutoSpeak = localStorage.getItem('voiceAssistantAutoSpeak');
            if (savedAutoSpeak !== null) {
                setAutoSpeak(JSON.parse(savedAutoSpeak));
            }
        } catch (error) {
            console.error('Failed to load settings:', error);
        }
    }, [mounted]);

    // Save query history to localStorage
    const saveQueryHistory = (newHistory: string[]) => {
        try {
            setQueryHistory(newHistory);
            localStorage.setItem('voiceAssistantHistory', JSON.stringify(newHistory));
        } catch (error) {
            console.error('Failed to save query history:', error);
        }
    };

    // Save auto-speak setting
    const handleAutoSpeakChange = (
        event: React.ChangeEvent<HTMLInputElement>
    ) => {
        const newValue = event.target.checked;
        setAutoSpeak(newValue);
        try {
            localStorage.setItem('voiceAssistantAutoSpeak', JSON.stringify(newValue));
        } catch (error) {
            console.error('Failed to save auto-speak setting:', error);
        }
    };

    const handleVoiceInput = () => {
        if (!mounted) return;

        if (isListening) {
            stopListening();
        } else {
            resetTranscript();
            setIsEditing(false);
            setEditedQuery('');
            startListening();
        }
    };

    const handleStopQuery = () => {
        // Stop any pending auto-submit
        if (silenceTimer.current) {
            clearTimeout(silenceTimer.current);
            silenceTimer.current = null;
        }

        // Stop any in-progress API call
        if (abortController) {
            abortController.abort();
            setAbortController(null);
        }

        setIsProcessing(false);
    };

    // Edit functionality handlers
    const handleStartEdit = () => {
        console.log(
            '🛑 Starting edit mode - stopping auto-submit and any processing'
        );

        // IMMEDIATELY stop auto-submit timer
        if (silenceTimer.current) {
            clearTimeout(silenceTimer.current);
            silenceTimer.current = null;
        }

        // STOP any in-progress query processing
        if (abortController) {
            abortController.abort();
            setAbortController(null);
        }
        setIsProcessing(false);

        // Stop listening if currently listening
        if (isListening) {
            stopListening();
        }

        // Enter edit mode
        setIsEditing(true);
    };

    const handleCancelEdit = () => {
        setIsEditing(false);
        setEditedQuery(transcript || ''); // Reset to original transcript
    };

    const handleSubmitEditedQuery = () => {
        if (editedQuery.trim()) {
            console.log('🚀 Submitting EDITED query:', editedQuery);
            setIsEditing(false);
            processQuery(editedQuery.trim()); // Submit the EDITED query, not transcript
        }
    };

    const processQuery = async (query: string) => {
        console.log('πŸ“ Processing query:', query);
        setIsProcessing(true);
        setAnalysisData(null);
        setProgressMessage('Initializing analysis...');

        // Save to history
        const newHistory = [query, ...queryHistory.slice(0, 9)];
        saveQueryHistory(newHistory);

        try {
            const controller = new AbortController();
            setAbortController(controller);

            const response = await fetch('/api/analyze', {
                method: 'POST',
                headers: { 'Content-Type': 'application/json' },
                body: JSON.stringify({ query }),
                signal: controller.signal,
            });

            if (!response.ok) {
                throw new Error(`Analysis failed: ${response.status}`);
            }

            // Handle streaming response
            const reader = response.body?.getReader();
            if (!reader) {
                throw new Error('No reader available');
            }

            const decoder = new TextDecoder();
            let finalData = null;

            try {
                while (true) {
                    const { done, value } = await reader.read();

                    if (done) {
                        console.log('🏁 Stream completed');
                        break;
                    }

                    const chunk = decoder.decode(value, { stream: true });
                    const lines = chunk.split('\n').filter((line) => line.trim());

                    for (const line of lines) {
                        let data: any;
                        try {
                            data = JSON.parse(line);
                        } catch {
                            // Skip partial/non-JSON chunks instead of swallowing real errors
                            console.warn('⚠️ Could not parse line:', line);
                            continue;
                        }

                        if (data.type === 'progress') {
                            setProgressMessage(data.message);
                            console.log(`📊 Progress: ${data.message}`);
                        } else if (data.type === 'complete') {
                            console.log('📊 Analysis Complete:', data.data);
                            finalData = data.data;
                            setAnalysisData(finalData);
                            setProgressMessage('Analysis complete!');

                            // Keep the spoken summary so the manual "Speak Response" button works
                            if (data.speak) {
                                setLastResponse(data.speak);
                            }

                            // Auto-speak the response if enabled
                            if (autoSpeak && data.speak) {
                                console.log('🔊 Auto-speaking response');
                                try {
                                    await speak(data.speak);
                                } catch (speechErr) {
                                    console.error('Speech error:', speechErr);
                                }
                            }
                        } else if (data.type === 'error') {
                            // Surface server-side errors to the outer catch (previously this
                            // throw was caught by the JSON.parse catch and silently ignored)
                            throw new Error(data.message || 'Analysis failed');
                        }
                    }
                }
            } finally {
                reader.releaseLock();
            }

            if (!finalData) {
                throw new Error('No analysis data received');
            }
        } catch (error: any) {
            if (error.name === 'AbortError') {
                console.log('Query was cancelled');
                return;
            }

            console.error('Error processing query:', error);
            const errorMessage =
                'I apologize, but I encountered an error processing your request. Please try again.';
            setLastResponse(errorMessage);

            if (autoSpeak) {
                try {
                    await speak(errorMessage);
                } catch (speechErr) {
                    console.error('Speech error:', speechErr);
                }
            }
        } finally {
            setIsProcessing(false);
            setProgressMessage('');
            setAbortController(null);
        }
    };

    // Manual speak function for the "Speak Response" button
    const handleManualSpeak = () => {
        if (lastResponse && !isSpeaking) {
            speak(lastResponse);
        }
    };

    // Test voice function
    const handleTestVoice = () => {
        if (!isSpeaking) {
            speak(
                'Hello! This is a test of the selected voice. How does this sound?'
            );
        }
    };

    // Get voice display name
    const getVoiceDisplayName = (voice: SpeechSynthesisVoice) => {
        return `${voice.name} (${voice.lang})`;
    };

    // Prevent hydration mismatch by not rendering until mounted
    if (!mounted) {
        return (
            <Box sx={{ maxWidth: '1200px', mx: 'auto', p: { xs: 2, md: 3 } }}>
                <Card elevation={0} sx={{ p: { xs: 3, md: 4 }, textAlign: 'center' }}>
                    <Typography variant='h4'>Loading Voice Assistant...</Typography>
                </Card>
            </Box>
        );
    }

    return (
        <Box
            id='voice-assistant'
            sx={{ maxWidth: '1800px', mx: 'auto', p: { xs: 2, md: 3 } }}>
            {/* Main Control Panel */}
            <Card elevation={0} sx={{ mb: 4, p: { xs: 3, md: 5 } }}>
                <Box sx={{ textAlign: 'center', mb: 5 }}>
                    <Typography
                        variant='h3'
                        component='h2'
                        gutterBottom
                        sx={{
                            fontWeight: 700,
                            color: 'text.primary',
                            fontSize: { xs: '2rem', md: '2.5rem' },
                        }}>
                        AI Crypto Analysis
                    </Typography>
                    <Typography
                        variant='body1'
                        color='text.secondary'
                        sx={{
                            maxWidth: 600,
                            mx: 'auto',
                            fontSize: '1.1rem',
                            lineHeight: 1.6,
                        }}>
                        Speak naturally or type your question to get instant analysis
                    </Typography>
                </Box>

                {/* Voice Settings */}
                <Box sx={{ display: 'flex', justifyContent: 'center', mb: 4 }}>
                    <FormControlLabel
                        control={
                            <Switch
                                checked={autoSpeak}
                                onChange={handleAutoSpeakChange}
                                color='primary'
                            />
                        }
                        label='Auto-speak responses'
                        sx={{
                            '& .MuiFormControlLabel-label': {
                                fontSize: '0.9rem',
                                color: 'text.secondary',
                            },
                        }}
                    />
                </Box>

                {/* Voice Input Section */}
                <Box sx={{ display: 'flex', flexDirection: 'column', gap: 4 }}>
                    {/* Voice Button */}
                    <Box sx={{ textAlign: 'center' }}>
                        <Box
                            sx={{
                                display: 'flex',
                                alignItems: 'center',
                                justifyContent: 'center',
                                gap: 3,
                            }}>
                            <IconButton
                                onClick={handleVoiceInput}
                                disabled={
                                    (!isMicrophoneAvailable && !isListening) || isProcessing
                                }
                                sx={{
                                    width: { xs: 80, md: 100 },
                                    height: { xs: 80, md: 100 },
                                    background: isListening ? '#FF6B6B' : '#00C896',
                                    color: 'white',
                                    '&:hover': {
                                        background: isListening ? '#FF5252' : '#00B085',
                                        transform: 'scale(1.05)',
                                    },
                                    '&:disabled': {
                                        background: '#333',
                                        color: '#666',
                                    },
                                    transition: 'all 0.3s ease',
                                    boxShadow: isListening
                                        ? '0 0 30px rgba(255, 107, 107, 0.4)'
                                        : '0 0 30px rgba(0, 200, 150, 0.4)',
                                }}>
                                {isListening ? (
                                    <MicOff sx={{ fontSize: { xs: 32, md: 40 } }} />
                                ) : (
                                    <Mic sx={{ fontSize: { xs: 32, md: 40 } }} />
                                )}
                            </IconButton>

                            {/* Stop Processing Button */}
                            {isProcessing && (
                                <IconButton
                                    onClick={handleStopQuery}
                                    sx={{
                                        bgcolor: '#FF6B6B',
                                        color: 'white',
                                        '&:hover': { bgcolor: '#FF5252' },
                                    }}>
                                    <Stop />
                                </IconButton>
                            )}
                        </Box>

                        <Typography variant='body2' color='text.secondary' sx={{ mt: 2 }}>
                            {isListening
                                ? 'Listening... (Click to stop or wait for auto-submit)'
                                : 'Click to start voice input'}
                        </Typography>
                    </Box>

                    {/* Audio Controls - Always show when we have a response */}
                    <Box sx={{ display: 'flex', flexDirection: 'column', gap: 2 }}>
                        <Box
                            sx={{
                                display: 'flex',
                                alignItems: 'center',
                                justifyContent: 'center',
                                gap: 2,
                                flexWrap: 'wrap',
                            }}>
                            {/* Manual Speak Button */}
                            {lastResponse && !isSpeaking && (
                                <Button
                                    onClick={handleManualSpeak}
                                    startIcon={<RecordVoiceOver />}
                                    variant='contained'
                                    sx={{
                                        bgcolor: '#00C896',
                                        '&:hover': { bgcolor: '#00B085' },
                                    }}>
                                    Speak Response
                                </Button>
                            )}

                            {/* Playback Controls */}
                            {isSpeaking && (
                                <Stack direction='row' spacing={1}>
                                    <IconButton
                                        onClick={isPaused ? resume : pause}
                                        sx={{
                                            bgcolor: 'primary.main',
                                            color: 'white',
                                            '&:hover': { bgcolor: 'primary.dark' },
                                        }}>
                                        {isPaused ? <PlayArrow /> : <Pause />}
                                    </IconButton>
                                    <IconButton
                                        onClick={stopSpeaking}
                                        sx={{
                                            bgcolor: '#FF6B6B',
                                            color: 'white',
                                            '&:hover': { bgcolor: '#FF5252' },
                                        }}>
                                        <Stop />
                                    </IconButton>
                                </Stack>
                            )}

                            {/* Audio Settings Toggle */}
                            <Button
                                onClick={() => setShowAudioControls(!showAudioControls)}
                                startIcon={<Settings />}
                                endIcon={showAudioControls ? <ExpandLess /> : <ExpandMore />}
                                variant='outlined'
                                size='small'>
                                Audio Settings
                            </Button>
                        </Box>

                        {/* Voice Status */}
                        {isSpeaking && (
                            <Box sx={{ textAlign: 'center' }}>
                                <Chip
                                    icon={<VolumeUp />}
                                    label={isPaused ? 'Paused' : 'Speaking...'}
                                    color='primary'
                                    sx={{ animation: isPaused ? 'none' : 'pulse 2s infinite' }}
                                />
                            </Box>
                        )}

                        {/* Current Voice Display */}
                        {currentVoice && (
                            <Box sx={{ textAlign: 'center' }}>
                                <Chip
                                    icon={<VoiceChat />}
                                    label={`Voice: ${currentVoice.name}`}
                                    variant='outlined'
                                    size='small'
                                />
                            </Box>
                        )}

                        {/* Expanded Audio Controls */}
                        <Collapse in={showAudioControls}>
                            <Paper
                                elevation={1}
                                sx={{ p: 3, borderRadius: 2, bgcolor: 'background.paper' }}>
                                {/* Voice Selection */}
                                <Box sx={{ mb: 3 }}>
                                    <Typography
                                        variant='body2'
                                        gutterBottom
                                        sx={{ color: 'text.secondary', fontWeight: 600 }}>
                                        Voice Selection
                                    </Typography>
                                    <Stack
                                        direction={{ xs: 'column', sm: 'row' }}
                                        spacing={2}
                                        alignItems={{ sm: 'center' }}>
                                        <FormControl fullWidth sx={{ minWidth: 200 }}>
                                            <Select
                                                value={currentVoice?.name || ''}
                                                onChange={(e) => {
                                                    const selectedVoice = availableVoices.find(
                                                        (v) => v.name === e.target.value
                                                    );
                                                    setVoice(selectedVoice || null);
                                                }}
                                                displayEmpty
                                                size='small'>
                                                <MenuItem value=''>
                                                    <em>Default Voice</em>
                                                </MenuItem>
                                                {availableVoices
                                                    .filter((v) => v.lang.includes('US'))
                                                    .map((voice) => (
                                                        <MenuItem key={voice.name} value={voice.name}>
                                                            {getVoiceDisplayName(voice)}
                                                        </MenuItem>
                                                    ))}
                                            </Select>
                                        </FormControl>
                                        <Button
                                            onClick={handleTestVoice}
                                            variant='outlined'
                                            size='small'
                                            startIcon={<VoiceChat />}
                                            disabled={isSpeaking}>
                                            Test Voice
                                        </Button>
                                    </Stack>
                                </Box>

                                {/* Speed Control */}
                                <Box sx={{ mb: 3 }}>
                                    <Typography
                                        variant='body2'
                                        gutterBottom
                                        sx={{ color: 'text.secondary', fontWeight: 600 }}>
                                        Speech Speed: {currentRate}×
                                    </Typography>
                                    <Stack direction='row' spacing={1} flexWrap='wrap'>
                                        {[0.5, 0.75, 1, 1.25, 1.5, 2].map((option) => (
                                            <Button
                                                key={option}
                                                variant={
                                                    currentRate === option ? 'contained' : 'outlined'
                                                }
                                                onClick={() => setRate(option)}
                                                size='small'
                                                sx={{ minWidth: '60px' }}>
                                                {option}×
                                            </Button>
                                        ))}
                                    </Stack>
                                </Box>

                                {/* Volume Control */}
                                <Box>
                                    <Typography
                                        variant='body2'
                                        gutterBottom
                                        sx={{ color: 'text.secondary', fontWeight: 600 }}>
                                        Volume: {Math.round(currentVolume * 100)}%
                                    </Typography>
                                    <Box sx={{ display: 'flex', alignItems: 'center', gap: 2 }}>
                                        <IconButton onClick={() => setVolume(0)} size='small'>
                                            <VolumeOff />
                                        </IconButton>
                                        <Slider
                                            value={currentVolume}
                                            onChange={(_, value) => setVolume(value as number)}
                                            min={0}
                                            max={1}
                                            step={0.1}
                                            sx={{
                                                flex: 1,
                                                '& .MuiSlider-thumb': {
                                                    '&:hover': {
                                                        boxShadow:
                                                            '0px 0px 0px 8px rgba(63, 81, 181, 0.16)',
                                                    },
                                                },
                                            }}
                                        />
                                        <IconButton onClick={() => setVolume(1)} size='small'>
                                            <VolumeUp />
                                        </IconButton>
                                    </Box>
                                </Box>
                            </Paper>
                        </Collapse>
                    </Box>

                    {/* Transcript Display with IMMEDIATE Edit Options */}
                    <Grow in={!!(transcript || isEditing || editedQuery)}>
                        <Box>
                            {(transcript || isEditing || editedQuery) && (
                                <Paper
                                    elevation={0}
                                    sx={{
                                        p: 3,
                                        borderRadius: 2,
                                        bgcolor: '#1A1A1A',
                                        border: '1px solid #2A2A2A',
                                    }}>
                                    {isEditing ? (
                                        <Box
                                            sx={{ display: 'flex', flexDirection: 'column', gap: 2 }}>
                                            <Typography
                                                variant='body2'
                                                sx={{ color: 'text.secondary' }}>
                                                Edit your query:
                                            </Typography>
                                            <TextField
                                                fullWidth
                                                multiline
                                                rows={2}
                                                value={editedQuery}
                                                onChange={(e) => setEditedQuery(e.target.value)}
                                                placeholder='Type your cryptocurrency question...'
                                                variant='outlined'
                                                autoFocus
                                            />
                                            <Box sx={{ display: 'flex', gap: 2 }}>
                                                <Button
                                                    variant='contained'
                                                    onClick={handleSubmitEditedQuery}
                                                    startIcon={<Send />}
                                                    sx={{ flex: 1 }}
                                                    disabled={!editedQuery.trim()}>
                                                    Analyze
                                                </Button>
                                                <Button
                                                    variant='outlined'
                                                    onClick={handleCancelEdit}
                                                    startIcon={<Cancel />}>
                                                    Cancel
                                                </Button>
                                            </Box>
                                        </Box>
                                    ) : (
                                        <Box>
                                            <Typography
                                                variant='body2'
                                                sx={{ color: 'text.secondary', mb: 1 }}>
                                                {isListening ? 'Listening...' : 'Voice Input:'}
                                            </Typography>
                                            <Typography
                                                variant='body1'
                                                sx={{ mb: 2, fontStyle: 'italic' }}>
                                                "{editedQuery || transcript}"
                                            </Typography>

                                            {/* IMMEDIATE Edit and Submit Options - Show as soon as transcript appears */}
                                            {transcript && (
                                                <Box sx={{ display: 'flex', gap: 2, flexWrap: 'wrap' }}>
                                                    <Button
                                                        variant='outlined'
                                                        onClick={handleStartEdit}
                                                        startIcon={<Edit />}
                                                        size='small'
                                                        sx={{
                                                            borderColor: '#FFB800',
                                                            color: '#FFB800',
                                                            '&:hover': {
                                                                borderColor: '#FF9800',
                                                                bgcolor: alpha('#FFB800', 0.1),
                                                            },
                                                        }}>
                                                        Edit Query
                                                    </Button>
                                                    {!isListening && (
                                                        <Button
                                                            variant='contained'
                                                            onClick={() => processQuery(transcript)}
                                                            startIcon={<Psychology />}
                                                            size='small'
                                                            disabled={isProcessing}>
                                                            Analyze As-Is
                                                        </Button>
                                                    )}
                                                </Box>
                                            )}

                                            {/* Auto-submit countdown - only show when listening */}
                                            {isListening && transcript && (
                                                <Typography
                                                    variant='caption'
                                                    sx={{
                                                        color: 'warning.main',
                                                        mt: 1,
                                                        display: 'block',
                                                    }}>
                                                    Auto-submit in 4 seconds... (or click "Edit Query" to
                                                    modify)
                                                </Typography>
                                            )}
                                        </Box>
                                    )}
                                </Paper>
                            )}
                        </Box>
                    </Grow>

                    {/* Usage Instructions */}
                    <Box sx={{ textAlign: 'center', mt: 2 }}>
                        <Typography variant='body2' color='text.secondary'>
                            💡 Try: "What's Bitcoin's sentiment?" • "Should I buy Ethereum?" •
                            "How is Solana trending?"
                        </Typography>
                        <Typography
                            variant='caption'
                            color='text.secondary'
                            sx={{ display: 'block', mt: 1 }}>
                            Keyboard shortcuts: Space (voice), Esc (cancel), Enter (when
                            editing)
                        </Typography>
                    </Box>
                </Box>
            </Card>

            {/* Enhanced Progress Loading */}
            <AnalysisProgress
                isAnalyzing={isProcessing}
                progressMessage={progressMessage}
            />

            {/* Enhanced Analysis Results */}
            {analysisData && <AnalysisResults data={analysisData} />}

            {/* Error Display */}
            {(voiceError || speechError) && (
                <Paper
                    elevation={0}
                    sx={{
                        p: 2,
                        bgcolor: 'error.light',
                        color: 'error.contrastText',
                        mt: 2,
                    }}>
                    <Typography variant='body2'>{voiceError || speechError}</Typography>
                </Paper>
            )}
        </Box>
    );
});

VoiceAssistant.displayName = 'VoiceAssistant';
EOF

Create Main Page and Global Styles

# Create main page
cat > src/app/page.tsx << 'EOF'
'use client';

import 'regenerator-runtime/runtime';
import { useRef } from 'react';
import { Container, Box } from '@mui/material';
import { VoiceAssistant } from '@/components/VoiceAssistant';
import { HeroSection } from '@/components/HeroSection';
import { Footer } from '@/components/Footer';

export default function Home() {
  const voiceAssistantRef = useRef<{ startVoiceInput: () => void }>(null);

  const handleStartVoiceFromHero = () => {
    // Trigger voice input from the VoiceAssistant component
    if (voiceAssistantRef.current) {
      voiceAssistantRef.current.startVoiceInput();
    }
  };

  return (
    <Box sx={{ minHeight: '100vh', bgcolor: 'background.default' }}>
      {/* Hero Section with voice trigger */}
      <HeroSection onStartVoiceInput={handleStartVoiceFromHero} />

      {/* Main Application */}
      <Container maxWidth="xl" sx={{ py: 4 }}>
        <VoiceAssistant ref={voiceAssistantRef} />
      </Container>

      {/* Footer */}
      <Footer />
    </Box>
  );
}
EOF

# Create global styles
cat > src/app/globals.css << 'EOF'
/* Import Inter font */
@import url('https://fonts.googleapis.com/css2?family=Inter:wght@300;400;500;600;700;800&display=swap');

/* Reset and base styles */
* {
    margin: 0;
    padding: 0;
    box-sizing: border-box;
}

html {
    font-family: 'Inter', -apple-system, BlinkMacSystemFont, 'Segoe UI',
        sans-serif;
    scroll-behavior: smooth;
}

body {
    font-family: 'Inter', -apple-system, BlinkMacSystemFont, 'Segoe UI',
        sans-serif;
    line-height: 1.6;
    color: #ffffff;
    background-color: #0b0b0b;
    min-height: 100vh;
}

/* Custom scrollbar */
::-webkit-scrollbar {
    width: 8px;
}

::-webkit-scrollbar-track {
    background: #1a1a1a;
}

::-webkit-scrollbar-thumb {
    background: #404040;
    border-radius: 4px;
}

::-webkit-scrollbar-thumb:hover {
    background: #00c896;
}

/* Smooth transitions */
* {
    transition: all 0.2s ease-out;
}

/* Custom animation classes */
@keyframes pulse-soft {
    0%,
    100% {
        opacity: 1;
        transform: scale(1);
    }
    50% {
        opacity: 0.7;
        transform: scale(1.05);
    }
}

@keyframes pulse {
    0%,
    100% {
        opacity: 1;
        transform: scale(1);
    }
    50% {
        opacity: 0.5;
        transform: scale(1.1);
    }
}

@keyframes glow {
    0%,
    100% {
        box-shadow: 0 0 5px rgba(25, 118, 210, 0.5);
    }
    50% {
        box-shadow: 0 0 20px rgba(25, 118, 210, 0.8);
    }
}

.pulse-animation {
    animation: pulse-soft 2s ease-in-out infinite;
}

@keyframes fadeIn {
    from {
        opacity: 0;
        transform: translateY(20px);
    }
    to {
        opacity: 1;
        transform: translateY(0);
    }
}

.fade-in {
    animation: fadeIn 0.6s ease-out;
}

@keyframes slideUp {
    from {
        opacity: 0;
        transform: translateY(30px);
    }
    to {
        opacity: 1;
        transform: translateY(0);
    }
}

.slide-up {
    animation: slideUp 0.4s ease-out;
}

/* Sophisticated background */
body {
    background: #0b0b0b;
}

/* Glass-effect utility for cards */
.glass-effect {
    background: rgba(26, 26, 26, 0.8);
    backdrop-filter: blur(20px);
    border: 1px solid rgba(255, 255, 255, 0.1);
}
EOF

Create Theme and Footer

cat > src/lib/theme.ts << 'EOF'
'use client';
import { createTheme } from '@mui/material/styles';

const theme = createTheme({
    palette: {
        mode: 'dark',
        primary: {
            main: '#00C896',
            light: '#00D4AA',
            dark: '#00B085',
            contrastText: '#000000',
        },
        secondary: {
            main: '#FF6B6B',
            light: '#FF8A80',
            dark: '#F44336',
            contrastText: '#FFFFFF',
        },
        success: {
            main: '#00C896',
            light: '#00D4AA',
            dark: '#00B085',
        },
        error: {
            main: '#FF6B6B',
            light: '#FF8A80',
            dark: '#F44336',
        },
        warning: {
            main: '#FFB020',
            light: '#FFC952',
            dark: '#E09600',
        },
        background: {
            default: '#0B0B0B',
            paper: '#1A1A1A', // Subtle card backgrounds
        },
        text: {
            primary: '#FFFFFF', // Pure white text
            secondary: '#B3B3B3', // Muted gray for secondary text
        },
        divider: '#2A2A2A',
    },
    typography: {
        fontFamily: '"Inter", -apple-system, BlinkMacSystemFont, "Segoe UI", sans-serif',
        h1: {
            fontSize: '4rem',
            fontWeight: 700,
            lineHeight: 1.1,
            color: '#FFFFFF',
            letterSpacing: '-0.02em',
        },
        h2: {
            fontSize: '3rem',
            fontWeight: 700,
            lineHeight: 1.2,
            color: '#FFFFFF',
            letterSpacing: '-0.01em',
        },
        h3: {
            fontSize: '2.25rem',
            fontWeight: 600,
            lineHeight: 1.3,
            color: '#FFFFFF',
        },
        h4: {
            fontSize: '1.875rem',
            fontWeight: 600,
            lineHeight: 1.4,
            color: '#FFFFFF',
        },
        h5: {
            fontSize: '1.5rem',
            fontWeight: 500,
            lineHeight: 1.5,
            color: '#FFFFFF',
        },
        h6: {
            fontSize: '1.25rem',
            fontWeight: 500,
            lineHeight: 1.5,
            color: '#FFFFFF',
        },
        body1: {
            fontSize: '1rem',
            lineHeight: 1.6,
            color: '#FFFFFF',
        },
        body2: {
            fontSize: '0.875rem',
            lineHeight: 1.5,
            color: '#B3B3B3',
        },
    },
    shape: {
        borderRadius: 8, // More subtle than our previous 12px
    },
    components: {
        MuiCssBaseline: {
            styleOverrides: {
                body: {
                    backgroundColor: '#0B0B0B',
                    color: '#FFFFFF',
                },
            },
        },
        MuiCard: {
            styleOverrides: {
                root: {
                    backgroundColor: '#1A1A1A',
                    border: '1px solid #2A2A2A',
                    borderRadius: 12,
                    boxShadow: '0 4px 20px rgba(0, 0, 0, 0.3)',
                    '&:hover': {
                        boxShadow: '0 8px 32px rgba(0, 0, 0, 0.4)',
                        transform: 'translateY(-2px)',
                        transition: 'all 0.3s ease-out',
                    },
                },
            },
        },
        MuiButton: {
            styleOverrides: {
                root: {
                    textTransform: 'none',
                    borderRadius: 24,
                    padding: '12px 24px',
                    fontSize: '0.875rem',
                    fontWeight: 600,
                    boxShadow: 'none',
                    '&:hover': {
                        boxShadow: '0 4px 12px rgba(0, 0, 0, 0.25)',
                        transform: 'translateY(-1px)',
                    },
                },
                contained: {
                    background: '#00C896', // Solid green, not gradient
                    color: '#000000',
                    '&:hover': {
                        background: '#00B085',
                    },
                },
                outlined: {
                    borderColor: '#2A2A2A',
                    color: '#B3B3B3',
                    '&:hover': {
                        borderColor: '#00C896',
                        backgroundColor: 'rgba(0, 200, 150, 0.08)',
                        color: '#00C896',
                    },
                },
            },
        },
        MuiChip: {
            styleOverrides: {
                root: {
                    borderRadius: 6,
                    fontSize: '0.75rem',
                    fontWeight: 600,
                    height: 'auto',
                    padding: '6px 12px',
                },
                colorSuccess: {
                    backgroundColor: '#00C896',
                    color: '#000000',
                },
                colorError: {
                    backgroundColor: '#FF6B6B',
                    color: '#FFFFFF',
                },
                colorWarning: {
                    backgroundColor: '#FFB020',
                    color: '#000000',
                },
            },
        },
        MuiTextField: {
            styleOverrides: {
                root: {
                    '& .MuiOutlinedInput-root': {
                        borderRadius: 8,
                        backgroundColor: '#2A2A2A',
                        '& fieldset': {
                            borderColor: '#404040',
                        },
                        '&:hover fieldset': {
                            borderColor: '#00C896',
                        },
                        '&.Mui-focused fieldset': {
                            borderColor: '#00C896',
                        },
                    },
                    '& .MuiOutlinedInput-input': {
                        color: '#FFFFFF',
                    },
                    '& .MuiInputLabel-root': {
                        color: '#B3B3B3',
                    },
                },
            },
        },
        MuiSlider: {
            styleOverrides: {
                root: {
                    color: '#00C896',
                    '& .MuiSlider-thumb': {
                        backgroundColor: '#00C896',
                        border: '2px solid #FFFFFF',
                        '&:hover': {
                            boxShadow: '0 0 0 8px rgba(0, 200, 150, 0.16)',
                        },
                    },
                    '& .MuiSlider-track': {
                        backgroundColor: '#00C896',
                    },
                    '& .MuiSlider-rail': {
                        backgroundColor: '#404040',
                    },
                },
            },
        },
        MuiIconButton: {
            styleOverrides: {
                root: {
                    color: '#B3B3B3',
                    '&:hover': {
                        backgroundColor: 'rgba(0, 200, 150, 0.08)',
                        color: '#00C896',
                    },
                },
            },
        },
    },
});

export default theme;
EOF


# Create Footer
cat > src/components/Footer.tsx << 'EOF'
'use client';

import {
    Box,
    Container,
    Grid,
    Typography,
    Link,
    IconButton,
    Divider,
} from '@mui/material';
import { GitHub, LinkedIn, Email, OpenInNew } from '@mui/icons-material';

export function Footer() {
    const techLinks = [
        { name: 'Next.js 14', url: 'https://nextjs.org' },
        { name: 'Material-UI', url: 'https://mui.com' },
    ];

    const powerLinks = [
        { name: 'Google Gemini AI', url: 'https://ai.google.dev' },
        { name: 'LunarCrush MCP', url: 'https://lunarcrush.com' },
    ];

    return (
        <Box component='footer' sx={{ bgcolor: 'grey.900', color: 'grey.300' }}>
            <Container maxWidth='lg' sx={{ py: 6 }}>
                <Grid container spacing={4}>
                    {/* Project Info */}
                    <Grid xs={12} md={6}>
                        <Typography variant='h6' sx={{ color: 'white', mb: 2 }}>
                            Voice Crypto Assistant
                        </Typography>
                        <Typography variant='body2' sx={{ mb: 3, maxWidth: 400 }}>
                            A sophisticated voice-activated cryptocurrency analysis assistant
                            that combines Google Gemini AI, LunarCrush social data, and modern
                            web technologies.
                        </Typography>
                        <Box sx={{ display: 'flex', gap: 1 }}>
                            <IconButton
                                component={Link}
                                href='https://github.com/danilobatson'
                                target='_blank'
                                sx={{ color: 'grey.400', '&:hover': { color: 'white' } }}>
                                <GitHub />
                            </IconButton>
                            <IconButton
                                component={Link}
                                href='https://linkedin.com/in/danilo-batson'
                                target='_blank'
                                sx={{ color: 'grey.400', '&:hover': { color: 'white' } }}>
                                <LinkedIn />
                            </IconButton>
                            <IconButton
                                component={Link}
                                href='mailto:djbatson19@gmail.com'
                                sx={{ color: 'grey.400', '&:hover': { color: 'white' } }}>
                                <Email />
                            </IconButton>
                        </Box>
                    </Grid>

                    {/* Technology Stack */}
                    <Grid xs={12} sm={6} md={3}>
                        <Typography variant='h6' sx={{ color: 'white', mb: 2 }}>
                            Technology
                        </Typography>
                        <Box sx={{ display: 'flex', flexDirection: 'column', gap: 1 }}>
                            {techLinks.map((tech) => (
                                <Link
                                    key={tech.name}
                                    href={tech.url}
                                    target='_blank'
                                    sx={{
                                        color: 'grey.400',
                                        textDecoration: 'none',
                                        display: 'flex',
                                        alignItems: 'center',
                                        gap: 0.5,
                                        '&:hover': { color: 'white' },
                                        fontSize: '0.875rem',
                                    }}>
                                    {tech.name}
                                    <OpenInNew sx={{ fontSize: 12 }} />
                                </Link>
                            ))}
                        </Box>
                    </Grid>

                    <Grid xs={12} sm={6} md={3}>
                        <Typography variant='h6' sx={{ color: 'white', mb: 2 }}>
                            Powered By
                        </Typography>
                        <Box sx={{ display: 'flex', flexDirection: 'column', gap: 1 }}>
                            {powerLinks.map((link) => (
                                <Link
                                    key={link.name}
                                    href={link.url}
                                    target='_blank'
                                    sx={{
                                        color: 'grey.400',
                                        textDecoration: 'none',
                                        display: 'flex',
                                        alignItems: 'center',
                                        gap: 0.5,
                                        '&:hover': { color: 'white' },
                                        fontSize: '0.875rem',
                                    }}>
                                    {link.name}
                                    <OpenInNew sx={{ fontSize: 12 }} />
                                </Link>
                            ))}
                        </Box>
                    </Grid>
                </Grid>

                {/* Bottom Bar */}
                <Divider sx={{ my: 3, bgcolor: 'grey.800' }} />
                <Box
                    sx={{
                        display: 'flex',
                        flexDirection: { xs: 'column', md: 'row' },
                        justifyContent: 'space-between',
                        alignItems: 'center',
                        gap: 2,
                    }}>
                    <Typography variant='body2' sx={{ color: 'grey.400' }}>
                        Β© 2025 Danilo Jamaal Batson. Built with Next.js, Google Gemini AI,
                        and LunarCrush MCP.
                    </Typography>
                    <Box sx={{ display: 'flex', gap: 3 }}>
                        <Link
                            href='https://github.com/danilobatson/voice-crypto-assistant'
                            target='_blank'
                            sx={{
                                color: 'grey.400',
                                textDecoration: 'none',
                                '&:hover': { color: 'white' },
                                fontSize: '0.875rem',
                            }}>
                            View Source
                        </Link>
                        <Link
                            href='https://lunarcrush.com/developers'
                            target='_blank'
                            sx={{
                                color: 'grey.400',
                                textDecoration: 'none',
                                '&:hover': { color: 'white' },
                                fontSize: '0.875rem',
                            }}>
                            API Documentation
                        </Link>
                    </Box>
                </Box>
            </Container>
        </Box>
    );
}
EOF

Testing & Deployment (5 minutes)

Local Testing

# Test your development setup
npm run build

# If build succeeds, start development server
npm run dev

Manual Testing Checklist:

  • ✅ Page loads at localhost:3000
  • ✅ Voice input button responds to clicks
  • ✅ Microphone permission prompt appears
  • ✅ Voice recognition transcribes speech
  • ✅ Edit functionality works immediately
  • ✅ API analysis returns real results
  • ✅ Voice output speaks responses
  • ✅ Mobile responsive design works
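
If the voice features don't respond during testing, a quick capability check helps isolate the problem. The snippet below is a minimal sketch (not part of the tutorial files) you can paste into the browser console or a temporary useEffect; it only probes the Web Speech APIs and microphone permission the assistant depends on. Note that speech recognition currently works in Chromium-based browsers and Safari, but not in Firefox.

// Quick browser capability check for the voice features (run in the console)
const hasRecognition =
    'SpeechRecognition' in window || 'webkitSpeechRecognition' in window;
const hasSynthesis = 'speechSynthesis' in window;

console.log('Speech recognition supported:', hasRecognition);
console.log('Speech synthesis supported:', hasSynthesis);

// Resolves only if the user grants microphone access
navigator.mediaDevices
    .getUserMedia({ audio: true })
    .then(() => console.log('Microphone access granted'))
    .catch((err) => console.warn('Microphone access denied:', err));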

Level Up: AI Enhancement Prompts

Ready to extend your Voice Crypto Assistant? Use these prompts with ChatGPT or Claude:

Portfolio Management

"Add portfolio tracking to this voice crypto assistant. Allow users to voice-add positions like 'I bought 2 ETH at $1800' and track profit/loss with the existing MCP analysis framework."

Multi-Crypto Comparison

"Extend this voice assistant to compare multiple cryptocurrencies simultaneously. Add voice commands like 'Compare Bitcoin and Ethereum performance' with side-by-side analysis using the existing MCP tools."

Advanced Voice Controls

"Add advanced voice commands like 'Set price alert for Bitcoin at $50,000' and 'Show me top 5 trending coins' using the existing voice recognition and MCP integration patterns."

Real-time Updates

"Implement WebSocket connections for real-time price updates during voice conversations, integrating with the existing LunarCrush MCP data flow."
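
As a starting point for that last prompt, a client-side hook for consuming live price updates might look like the sketch below. The WebSocket URL and message shape are placeholders, not a real LunarCrush endpoint; the MCP flow in this tutorial is request/response, so you would wire this up to whatever streaming source you choose.

'use client';

import { useEffect, useState } from 'react';

// Hypothetical message shape; adapt to your streaming source
interface PriceUpdate {
    symbol: string;
    price: number;
}

export function useLivePrice(symbol: string, url = 'wss://example.com/prices') {
    const [price, setPrice] = useState<number | null>(null);

    useEffect(() => {
        const socket = new WebSocket(`${url}?symbol=${symbol}`);

        socket.onmessage = (event) => {
            const update: PriceUpdate = JSON.parse(event.data);
            if (update.symbol === symbol) setPrice(update.price);
        };

        // Close the connection when the symbol changes or the component unmounts
        return () => socket.close();
    }, [symbol, url]);

    return price;
}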

Conclusion

Congratulations! You've successfully built a production-ready Voice Crypto Assistant that demonstrates cutting-edge AI development patterns.

What You've Accomplished

  • ✅ Voice-First Interface - Natural speech recognition with intelligent crypto detection
  • ✅ MCP Protocol Integration - Secure AI-to-data connections with LunarCrush
  • ✅ Advanced AI Analysis - Google Gemini 2.0 generating comprehensive market insights
  • ✅ Professional UI - Material-UI dark theme optimized for trading
  • ✅ Smart Editing - Immediate correction capabilities for voice recognition
  • ✅ Advanced Voice Controls - Voice selection, speed control, volume management
  • ✅ Real-time Progress - 4-step animated analysis pipeline

Key Technical Insights

MCP Protocol Benefits Demonstrated:

  • Intelligent Tool Selection - AI chooses optimal data sources dynamically
  • Structured Data Access - Secure, standardized connections to real-time data
  • Protocol-Level Error Handling - Robust connection management and fallbacks
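
For reference, the core of that MCP hand-off can be sketched in a few lines with the @modelcontextprotocol/sdk client. Treat this as an illustration only: the endpoint URL and tool name below are placeholders, and the real values come from your LunarCrush dashboard and the API route you built earlier.

import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { SSEClientTransport } from '@modelcontextprotocol/sdk/client/sse.js';

async function sketchMcpCall() {
    const client = new Client(
        { name: 'voice-crypto-assistant', version: '1.0.0' },
        { capabilities: {} }
    );

    // Placeholder endpoint: use the MCP URL and key from your LunarCrush account
    const transport = new SSEClientTransport(
        new URL('https://example-mcp-endpoint/sse?key=YOUR_KEY')
    );
    await client.connect(transport);

    // Discover the tools the server exposes, then let Gemini decide which to invoke
    const { tools } = await client.listTools();
    console.log(tools.map((tool) => tool.name));

    // Call a tool directly (tool names and arguments depend on the server)
    const result = await client.callTool({
        name: 'Topic', // hypothetical tool name
        arguments: { topic: 'bitcoin' },
    });
    console.log(result);
}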

Modern Development Patterns:

  • TypeScript Excellence - Full type safety with advanced interfaces
  • React Performance - Optimized hooks, refs, and state management
  • Voice UI Design - Balancing user experience with technical constraints
  • Error Recovery - Graceful degradation and comprehensive user feedback

What's Next?

Advanced Features:

  • Custom Wake Words - Personalized voice activation commands
  • Enterprise Integration - Slack bots and Teams integration for institutions
  • Mobile App - React Native version with offline capabilities
  • AI Trading Signals - Advanced algorithmic trading recommendations

🚀 Take Action

Get Started Now:

  1. Subscribe to LunarCrush API - Access unique social intelligence
  2. Fork the Repository - Build your enhanced version
  3. Deploy Your Own - Launch on AWS Amplify

Learn More:

🚀 Complete GitHub Repository (Full Source Code)


Built with ❤️ using LunarCrush MCP • Google Gemini AI • Next.js • Material-UI

Questions? Drop them below! I respond to every comment and love helping fellow developers build amazing voice-powered AI applications. 🚀

Ready to revolutionize how you interact with cryptocurrency data? Start building your voice-powered crypto assistant today!

Get Started with LunarCrush MCP →

Use my discount referral code JAMAALBUILDS to receive 15% off your plan.
