
Roshni
Local-First AI Astrologer: Building a Private, Multimodal Vedic AI Agent🔮

Where Ancient Wisdom Meets Local AI

What if your birth chart could be interpreted by an AI that actually reads classical Vedic texts, understands planetary mathematics, analyzes your palm through a camera — and does it all without sending your data to the cloud?

As part of the Vision Possible Hackathon by VisionAgents AI, I explored a bold idea:

Can we build a culturally intelligent, multimodal AI agent that runs locally and respects user privacy?

The result is Local-First AI Astrologer — an open-source voice + vision AI system that delivers personalized Vedic astrology readings completely on-device.

Table of Contents

  1. The Vision Behind the Project
  2. Core Features
  3. Birth Chart Intelligence
  4. Kundli Matching
  5. AI Palm Reading
  6. Tech Stack Deep Dive
  7. How Local RAG Ensures Privacy
  8. Setup & Installation
  9. Demo
  10. Challenges & Learnings
  11. Why This Matters
  12. Conclusion

The Vision Behind the Project

Most astrology platforms today are:

  • Static
  • Generic
  • Cloud-dependent
  • Non-explainable

Vedic astrology, however, is deeply structured — Nakshatra systems, Dashas, planetary transits, Guna Milan compatibility — yet few tools provide contextual reasoning grounded in authentic texts.

I wanted to build something different:

  • Retrieval-grounded reasoning
  • Real-time voice interaction
  • Vision-powered palm reading
  • Fully local architecture

Core Features

Birth Chart Intelligence

The AI generates personalized analysis including:

  • Nakshatra interpretation
  • Rashi (Moon sign) explanation
  • Mahadasha timeline breakdown
  • Planetary transit insights
  • Traditional remedies

Powered by:

  • ephem for astronomical calculations
  • Local RAG over curated Vedic astrology PDFs
  • FAISS vector search
  • Sentence-transformer embeddings

Instead of hallucinating, the system retrieves knowledge from classical texts and explains it conversationally.
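To make the Nakshatra and Mahadasha pieces concrete, here is a minimal sketch of how a sidereal Moon longitude maps to its nakshatra and opening Vimshottari dasha lord. The names, the 13°20′ arc, and the 120-year lord cycle follow the standard Vimshottari scheme; converting ephem's tropical longitude to sidereal (subtracting an ayanamsa) is assumed to happen upstream and is not shown.

```python
# Sketch: sidereal Moon longitude -> nakshatra + starting Mahadasha lord.
# Assumes the ayanamsa correction was already applied to the longitude.

NAKSHATRAS = [
    "Ashwini", "Bharani", "Krittika", "Rohini", "Mrigashira", "Ardra",
    "Punarvasu", "Pushya", "Ashlesha", "Magha", "Purva Phalguni",
    "Uttara Phalguni", "Hasta", "Chitra", "Swati", "Vishakha", "Anuradha",
    "Jyeshtha", "Mula", "Purva Ashadha", "Uttara Ashadha", "Shravana",
    "Dhanishta", "Shatabhisha", "Purva Bhadrapada", "Uttara Bhadrapada",
    "Revati",
]
# Vimshottari lords repeat every 9 nakshatras, starting from Ashwini.
DASHA_LORDS = ["Ketu", "Venus", "Sun", "Moon", "Mars",
               "Rahu", "Jupiter", "Saturn", "Mercury"]
DASHA_YEARS = {"Ketu": 7, "Venus": 20, "Sun": 6, "Moon": 10, "Mars": 7,
               "Rahu": 18, "Jupiter": 16, "Saturn": 19, "Mercury": 17}

ARC = 360.0 / 27  # each nakshatra spans 13 degrees 20 minutes

def moon_nakshatra(sidereal_lon_deg: float):
    """Return (nakshatra, dasha lord, dasha years elapsed at birth)."""
    lon = sidereal_lon_deg % 360.0
    idx = int(lon // ARC)
    lord = DASHA_LORDS[idx % 9]
    # Fraction of the nakshatra already traversed equals the fraction of
    # the first Mahadasha elapsed at birth.
    elapsed = (lon % ARC) / ARC * DASHA_YEARS[lord]
    return NAKSHATRAS[idx], lord, round(elapsed, 2)
```

This is also why the precision issue discussed later matters: a small longitude error can shift `elapsed` by months on the Dasha timeline.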

Kundli Matching

The compatibility engine performs:

  • Guna Milan scoring
  • Dosha detection
  • Contextual compatibility reasoning

It doesn’t just output numbers — it explains the relationship dynamics in natural language.
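As a rough sketch of the scoring side, here is the Guna Milan aggregation step, assuming the per-koota scores have already been computed from both charts. The eight kootas and their maximum points (36 total) follow the standard Ashtakoota scheme; the verdict thresholds are the commonly cited ones, not necessarily the project's exact cutoffs.

```python
# Sketch: aggregate Ashtakoota (Guna Milan) scores into a verdict.
# Per-koota scoring from the two charts is assumed to happen elsewhere.

KOOTA_MAX = {"Varna": 1, "Vashya": 2, "Tara": 3, "Yoni": 4,
             "Graha Maitri": 5, "Gana": 6, "Bhakoot": 7, "Nadi": 8}

def guna_milan(scores: dict) -> dict:
    """Sum per-koota points (validated against each koota's maximum)."""
    for koota, pts in scores.items():
        if not 0 <= pts <= KOOTA_MAX[koota]:
            raise ValueError(f"{koota} score {pts} exceeds max {KOOTA_MAX[koota]}")
    total = sum(scores.values())
    if total >= 33:
        verdict = "excellent"
    elif total >= 25:
        verdict = "good"
    elif total >= 18:
        verdict = "average"
    else:
        verdict = "not recommended"
    return {"total": total, "out_of": 36, "verdict": verdict}
```

The natural-language explanation layer then takes the per-koota breakdown, not just the total, and grounds each point in the retrieved texts.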

AI Palm Reading (Vision + RAG)

Hold your hand up to the camera and the system:

  • Detects major lines
  • Identifies mounts
  • Classifies hand shape
  • Maps features to palmistry knowledge base

No NVIDIA APIs.
No heavy cloud inference.

Just lightweight vision integration combined with local retrieval.
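The bridge between vision and retrieval can be sketched as a simple mapping step: detected palm features become scoped query strings for the local knowledge base. The feature names and query templates below are illustrative assumptions, not the project's actual schema.

```python
# Sketch: turn detected palm features into scoped retrieval queries.
# Feature names and templates are hypothetical; the real pipeline feeds
# these strings into the local RAG index.

QUERY_TEMPLATES = {
    "heart_line": "palmistry meaning of a {quality} heart line",
    "head_line": "palmistry meaning of a {quality} head line",
    "life_line": "palmistry meaning of a {quality} life line",
    "hand_shape": "palmistry interpretation of a {quality} hand shape",
}

def features_to_queries(features: dict) -> list:
    """Map {feature: qualifier} pairs to knowledge-base query strings,
    skipping anything the detector reports that has no template."""
    return [QUERY_TEMPLATES[name].format(quality=quality)
            for name, quality in features.items()
            if name in QUERY_TEMPLATES]
```

Keeping this mapping explicit is what makes the retrieval "scoped": only queries about observed features reach the index, so answers stay grounded in what the camera actually saw.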

Tech Stack Deep Dive

Built 100% in Python.

| Layer | Technology |
| --- | --- |
| Embeddings | sentence-transformers |
| Video Stream | GetStream.io |
| Vector Store | FAISS |
| Knowledge Base | Vedic Astrology PDFs + Palmistry for All |
| Astronomy Engine | ephem |
| Voice | Gemini Realtime / Deepgram / ElevenLabs |
| Vision | Camera-based processing |
| RAG Setup | setup_rag.py |
| Agent Runtime | agent.py |

The architecture supports:

  • Buffered conversational responses
  • Optional voice/video modes
  • Fully local document indexing
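"Buffered conversational responses" can be illustrated with a small generator that accumulates streamed text and releases it only at sentence boundaries, so the voice layer speaks complete sentences rather than word fragments. The chunking rule here is my assumption, not the project's exact implementation.

```python
# Sketch: buffer a token/chunk stream and emit complete sentences.
# Useful when a TTS engine should not be fed mid-sentence fragments.
import re

SENTENCE_END = re.compile(r"(?<=[.!?])\s+")

def buffer_sentences(chunks):
    """Yield complete sentences from an iterable of partial text chunks."""
    buf = ""
    for chunk in chunks:
        buf += chunk
        parts = SENTENCE_END.split(buf)
        # Every part except the last ends in punctuation: emit it.
        for sentence in parts[:-1]:
            yield sentence
        buf = parts[-1]  # keep the unfinished tail
    if buf.strip():
        yield buf.strip()
```

This keeps latency low (each sentence is spoken as soon as it completes) without waiting for the full model response.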

How Local RAG Ensures Privacy

Most AI tools send user inputs to cloud APIs.

This system:

  • Uses local embeddings
  • Runs FAISS on-device
  • Performs scoped document retrieval
  • Avoids external knowledge calls

Your:

  • Birth date
  • Time & location
  • Compatibility inputs
  • Palm images

never leave your device.
This is privacy-first multimodal AI.
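The fully local retrieval loop looks roughly like this. In the real project the embeddings come from sentence-transformers and the nearest-neighbour search runs through FAISS; in this sketch a toy bag-of-words vector and cosine similarity stand in for both, purely to show that every step happens in-process with no network call.

```python
# Sketch: local retrieval loop. Bag-of-words + cosine similarity here are
# stand-ins for sentence-transformers embeddings and FAISS search.
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy embedding: lowercase bag-of-words counts."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def retrieve(query: str, chunks: list, k: int = 2) -> list:
    """Return the k chunks most similar to the query, entirely on-device."""
    q = embed(query)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]
```

Swap `embed` for a sentence-transformers model and `retrieve` for a FAISS index lookup and the shape of the pipeline stays identical; the privacy property comes from the fact that both swaps run locally.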

Setup & Installation

Clone the repository:

```shell
git clone https://github.com/SpandanM110/Local-First-AI-Astrologer
cd Local-First-AI-Astrologer
pip install -r requirements.txt
```

Then:

  • Add API keys in .env
  • Place PDFs inside /knowledge
  • Run:

```shell
python setup_rag.py
python agent.py run
```

  • Enable your camera for palm reading

Text-only mode works as well.

Demo

Watch the full demo here: https://youtu.be/q6vUcWZL22E?si=wQS-jXx4LDpHFer7

The demo showcases:

  • Live birth chart reasoning
  • Voice interaction
  • Real-time palm detection
  • Smooth buffered responses

Challenges & Learnings

  1. Precision in Astronomical Computation: Small calculation differences significantly impact Dasha timelines.
  2. Reducing Hallucinations: RAG dramatically improved factual grounding.
  3. Lightweight Vision: Building palm reading without expensive GPU dependencies was critical.

Big takeaway: Multimodal AI becomes powerful when retrieval is precise and scoped.

Why This Matters

This project explores three important AI trends:

Local-First AI → Privacy by design
Multimodal Agents → Voice + Vision + Retrieval
Cultural Intelligence Systems → Domain-specialized reasoning

Instead of one giant generalized assistant, the future may belong to hyper-specialized, privacy-preserving agents.

Conclusion

Local-First AI Astrologer is more than a hackathon project — it’s a prototype for what personal AI agents can become:

  • Context-aware
  • Culturally grounded
  • Multimodal
  • Privacy-respecting

If you're excited about:

  • AI agents
  • RAG architectures
  • Local-first systems
  • Multimodal experimentation

⭐ Star the repo
🍴 Fork it
💬 Share feedback

GitHub: https://github.com/SpandanM110/Local-First-AI-Astrologer
