Introduction
MoodMatch is an AI-powered agent that analyzes your emotional state and provides personalized recommendations for music, movies, and books. Built for the HNG Stage 3 Backend Task, it demonstrates the power of the A2A (Agent-to-Agent) protocol and AI integration.
What MoodMatch Does
- Analyzes user messages to detect emotions
- Supports 52 different mood categories
- Recommends music from Spotify
- Suggests movies from TMDB
- Recommends books from Google Books (see the lookup sketch after this list)
- Provides direct, clickable links to all recommendations
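Each recommendation source is a plain HTTP lookup. As a rough illustration, here is a minimal sketch of the books side using the public Google Books volumes endpoint; the `book_recommendations` helper and the mood-to-query mapping are illustrative, not MoodMatch's actual code.

```python
# Minimal sketch of one recommendation source: the public Google Books
# volumes search endpoint. The mood-to-query mapping is illustrative.
import requests

def book_recommendations(mood: str, limit: int = 3) -> list[dict]:
    resp = requests.get(
        "https://www.googleapis.com/books/v1/volumes",
        params={"q": f"subject:{mood}", "maxResults": limit},
        timeout=10,
    )
    resp.raise_for_status()
    items = resp.json().get("items", [])
    return [
        {
            "title": v["volumeInfo"].get("title"),
            "authors": v["volumeInfo"].get("authors", []),
            "link": v["volumeInfo"].get("infoLink"),  # direct, clickable link
        }
        for v in items
    ]
```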
 
Technical Stack
- Backend: Python + FastAPI
- AI: Google Gemini 2.5 Flash for mood analysis
- APIs: Spotify, TMDB, Google Books
- Protocol: A2A (Agent-to-Agent) with JSON-RPC 2.0 (see the endpoint sketch after this list)
- Deployment: Leapcell (serverless platform)
- Integration: Telex.im messaging platform
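To give a feel for how these pieces fit together, below is a minimal sketch of a FastAPI endpoint that accepts a JSON-RPC 2.0 request the way an A2A client would send one. The route path, the fields read from `params`, and the `analyze_and_recommend` helper are assumptions for illustration; the real service wires in Gemini and the recommendation APIs.

```python
# Minimal sketch: a FastAPI endpoint accepting a JSON-RPC 2.0 request.
# The route path and analyze_and_recommend() are illustrative placeholders.
from fastapi import FastAPI, Request
from fastapi.responses import JSONResponse

app = FastAPI()

def analyze_and_recommend(text: str) -> dict:
    # Placeholder for the real pipeline: Gemini mood analysis followed by
    # Spotify / TMDB / Google Books lookups.
    return {"mood": "calm", "recommendations": []}

@app.post("/a2a")
async def a2a_endpoint(request: Request) -> JSONResponse:
    body = await request.json()

    # JSON-RPC 2.0 envelope: jsonrpc, method, params, id.
    if body.get("jsonrpc") != "2.0" or "method" not in body:
        return JSONResponse(
            {"jsonrpc": "2.0", "id": body.get("id"),
             "error": {"code": -32600, "message": "Invalid Request"}}
        )

    user_text = body.get("params", {}).get("text", "")
    result = analyze_and_recommend(user_text)

    return JSONResponse({"jsonrpc": "2.0", "id": body.get("id"), "result": result})
```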
 
Key Features
- Smart Mood Detection: Even an indirect message like "I need money" is mapped to the underlying emotion (here, stress)
- 52 Mood Categories: From happy to bittersweet, covering complex emotions
- Multi-source Recommendations: Music, movies, and books in one response (see the sketch after this list)
- Context-Aware: Considers time of day and emotion intensity
- Direct Links: Click and enjoy your recommendations immediately
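As a rough sketch of how a multi-source, context-aware response could be assembled (the three `search_*` helpers and the late-night rule below are hypothetical, for illustration only):

```python
# Sketch of assembling one multi-source, context-aware response.
# All three search_* helpers are hypothetical stand-ins for the real
# Spotify / TMDB / Google Books calls.
from datetime import datetime

def search_spotify(mood: str) -> list[dict]:
    return []  # hypothetical: would return tracks with Spotify links

def search_tmdb(mood: str) -> list[dict]:
    return []  # hypothetical: would return movies with TMDB links

def search_books(mood: str) -> list[dict]:
    return []  # hypothetical: would return books with Google Books links

def build_response(mood: str, intensity: float) -> dict:
    # Illustrative context rule: soften high-intensity moods late at night.
    if datetime.now().hour >= 22 and intensity > 0.8:
        mood = f"calm {mood}"

    return {
        "mood": mood,
        "music": search_spotify(mood),
        "movies": search_tmdb(mood),
        "books": search_books(mood),
    }
```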
 
Technical Challenges & Solutions
Challenge 1: A2A Protocol Learning Curve
The A2A protocol uses JSON-RPC 2.0, which was new to me. Solution: I studied the spec, implemented proper request/response handling, and tested the endpoint with Postman.
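For reference, a JSON-RPC 2.0 exchange is just a POST with a small envelope. The sketch below sends one such request with Python instead of Postman; the endpoint URL, method name, and params shape are placeholders, not the exact A2A schema.

```python
# Sending a JSON-RPC 2.0 request to the agent, the same check done in
# Postman. URL, method name, and params shape are placeholders.
import requests

payload = {
    "jsonrpc": "2.0",          # required version tag
    "id": 1,                   # client-chosen id, echoed back in the response
    "method": "message/send",  # illustrative method name
    "params": {"text": "I need money"},
}

resp = requests.post("https://example.com/a2a", json=payload, timeout=10)
print(resp.json())  # expected: {"jsonrpc": "2.0", "id": 1, "result": {...}} or an "error" object
```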
Challenge 2: Mood Detection Accuracy
Getting the AI to map free-form text to one of 52 specific moods was tricky. Solution: I used Gemini 2.5 Flash with structured output, plus fuzzy matching as a fallback when the returned label isn't an exact match.
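A fallback along these lines can be built with the standard library; the sketch below uses `difflib.get_close_matches` to snap a free-form label from the model onto the closest supported mood (the shortened mood list and the cutoff value are illustrative).

```python
# Fuzzy-matching fallback: snap a free-form label from the model onto the
# closest supported mood. The mood list here is shortened for illustration.
from difflib import get_close_matches

SUPPORTED_MOODS = ["happy", "sad", "stressed", "bittersweet", "nostalgic", "calm"]

def normalize_mood(raw_label: str, default: str = "calm") -> str:
    label = raw_label.strip().lower()
    if label in SUPPORTED_MOODS:
        return label
    # get_close_matches returns the best candidates above the cutoff ratio.
    matches = get_close_matches(label, SUPPORTED_MOODS, n=1, cutoff=0.6)
    return matches[0] if matches else default

print(normalize_mood("stressed out"))  # -> "stressed"
```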
    