
Ekrem MUTLU

How I Built Identity-as-a-Service for AI Agents: A Behind-the-Scenes Look at EqhoIDs

Hey Dev.to community! 👋

I've been working on a pretty exciting project lately, and I wanted to share the journey with you. It's called EqhoIDs, and it's essentially Identity-as-a-Service (IDaaS) tailored specifically for AI agents. Think of it as giving your AI agents verifiable, persistent, and unique identities that they can use across different platforms and applications.

Why build this? Well, as AI agents become more sophisticated and integrated into our lives, the need for them to have a reliable identity becomes paramount. Imagine an AI assistant making appointments, ordering goods, or even interacting with other AI agents. Without a solid identity, things can quickly become chaotic. We need to know who is doing what.

This article dives into the technical details of building EqhoIDs, covering the tech stack, challenges faced, and lessons learned. Buckle up!

The Vision: Identity for the AI Age

Before we dive into the code, let's clarify the core goals of EqhoIDs:

  • Unique Identification: Each AI agent should have a unique and persistent identifier.
  • Authentication & Authorization: Agents should be able to prove their identity and be granted access to resources based on their roles.
  • Verifiable Credentials: Agents should be able to hold and present verifiable credentials, allowing them to prove specific attributes (e.g., "This agent is authorized to make payments.").
  • Scalability & Reliability: The system should be able to handle a large number of agents and requests with minimal downtime.
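
Conceptually, an identity record that ties these goals together might look something like the sketch below. The field names and the credential string format are illustrative assumptions on my part, not EqhoIDs' actual schema (which lives behind the API):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class AgentIdentity:
    agent_id: str                      # unique, persistent identifier
    name: str
    credentials: List[str] = field(default_factory=list)  # e.g. "payments:authorized"

    def has_credential(self, cred: str) -> bool:
        # Authorization check: does this agent hold the given credential?
        return cred in self.credentials

agent = AgentIdentity(agent_id="agent-001", name="Booking Assistant",
                      credentials=["payments:authorized"])
print(agent.has_credential("payments:authorized"))  # True
print(agent.has_credential("admin"))                # False
```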

The Tech Stack

To bring this vision to life, I chose the following technologies:

  • Backend: FastAPI (Python)
  • Database: PostgreSQL
  • Caching: Redis
  • Email Verification: Cloudflare Email Workers
  • Voice Synthesis: ElevenLabs

Let's break down why I chose each of these:

FastAPI: The Pythonic Powerhouse

FastAPI is a modern, high-performance web framework for building APIs with Python. Its key advantages include:

  • Speed: Built on top of Starlette and Pydantic, FastAPI is incredibly fast.
  • Automatic Data Validation: Pydantic handles data validation and serialization, reducing boilerplate code.
  • Automatic API Documentation: FastAPI automatically generates interactive API documentation using OpenAPI (Swagger UI). This is a huge time-saver!

Here's a simple example of a FastAPI endpoint:

from fastapi import FastAPI

app = FastAPI()

@app.get("/agents/{agent_id}")
async def get_agent(agent_id: int):
    # In the real service this would fetch the agent from the database;
    # here we return a stub for illustration.
    return {"id": agent_id, "name": f"Agent {agent_id}"}

PostgreSQL: The Reliable Relational Database

PostgreSQL is a powerful, open-source relational database system known for its reliability, data integrity, and advanced features. I chose it for:

  • Data Integrity: PostgreSQL's ACID compliance ensures data consistency.
  • Extensibility: It supports a wide range of data types and extensions.
  • Scalability: PostgreSQL can be scaled both vertically and horizontally.

I used SQLAlchemy as an ORM (Object-Relational Mapper) to interact with the database. This allowed me to define my data models in Python and abstract away the complexities of writing raw SQL queries.

from sqlalchemy import create_engine, Column, Integer, String
from sqlalchemy.orm import Session, declarative_base, sessionmaker

DATABASE_URL = "postgresql://user:password@host:port/database"

engine = create_engine(DATABASE_URL)
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)

Base = declarative_base()

class Agent(Base):
    __tablename__ = "agents"

    id = Column(Integer, primary_key=True, index=True)
    name = Column(String)
    email = Column(String, unique=True, index=True)

Base.metadata.create_all(bind=engine)

# Example of creating a new agent
def create_agent(db: Session, name: str, email: str):
    db_agent = Agent(name=name, email=email)
    db.add(db_agent)
    db.commit()
    db.refresh(db_agent)
    return db_agent

Redis: The Speedy Cache

Redis is an in-memory data store that I used for caching frequently accessed data, such as agent profiles and authentication tokens. This significantly improved the performance of the API.

  • Speed: Redis is extremely fast due to its in-memory nature.
  • Data Structures: It supports various data structures, including strings, hashes, lists, and sets.
  • Pub/Sub: Redis's pub/sub capabilities can be used for real-time communication between components.

import redis

redis_client = redis.Redis(host='localhost', port=6379, db=0)

# Example of caching agent data
def get_agent_from_cache(agent_id: int):
    agent_data = redis_client.get(f"agent:{agent_id}")
    if agent_data:
        return agent_data.decode('utf-8')
    return None

def set_agent_in_cache(agent_id: int, agent_data: str):
    redis_client.set(f"agent:{agent_id}", agent_data)
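
Putting the cache and the database together, the pattern I used is the classic cache-aside: check Redis first, fall back to the database on a miss, then populate the cache with a TTL. Here's a runnable sketch; an in-memory stand-in replaces the real Redis client so the example is self-contained, but in the service itself `redis_client` plays that role:

```python
import json
import time

class InMemoryCache:
    """Stand-in for redis_client so this sketch runs without a Redis server.
    get/set mirror redis-py's interface, including set(..., ex=seconds)."""
    def __init__(self):
        self._store = {}

    def get(self, key):
        value, expires = self._store.get(key, (None, 0.0))
        return value if value is not None and time.time() < expires else None

    def set(self, key, value, ex=60):
        self._store[key] = (value, time.time() + ex)

def get_agent_cached(agent_id: int, cache, load_from_db) -> dict:
    # Cache-aside: try the cache, fall back to the DB, then populate.
    key = f"agent:{agent_id}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)
    agent = load_from_db(agent_id)
    cache.set(key, json.dumps(agent), ex=300)  # 5-minute TTL
    return agent

cache = InMemoryCache()
db_calls = []

def fake_db_load(agent_id: int) -> dict:
    db_calls.append(agent_id)
    return {"id": agent_id, "name": f"Agent {agent_id}"}

get_agent_cached(7, cache, fake_db_load)
get_agent_cached(7, cache, fake_db_load)  # served from cache
print(len(db_calls))  # 1 — the database was hit only once
```

The TTL matters: without an expiry, a renamed or deleted agent could be served stale from the cache indefinitely.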

Cloudflare Email Workers: Verification Made Easy

To verify agent email addresses, I used Cloudflare Email Workers. These serverless functions allowed me to intercept incoming emails, extract the verification link, and update the agent's status in the database.

  • Serverless: No need to manage servers or infrastructure.
  • Scalable: Cloudflare handles the scaling automatically.
  • Cost-Effective: Pay only for the resources you use.

The Email Worker script would parse the email, extract the verification link, and then make a request to a FastAPI endpoint to confirm the agent's email.
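For illustration, here's roughly what the extraction step looks like. The link format and token shape are assumptions on my part (the real worker runs in Cloudflare's runtime, and the domain here is a placeholder), but the logic is the same: find the verification link in the email body and pull out the token to pass along to the API:

```python
import re
from typing import Optional

# Assumed link format for this sketch:
#   https://<api-host>/verify?token=<hex token>
VERIFY_LINK_RE = re.compile(r"https://[^\s]+/verify\?token=([0-9a-f]+)")

def extract_verification_token(email_body: str) -> Optional[str]:
    # Return the token from the first verification link found, or None.
    match = VERIFY_LINK_RE.search(email_body)
    return match.group(1) if match else None

body = "Welcome! Confirm here: https://api.eqhoids.example/verify?token=ab12cd34"
print(extract_verification_token(body))  # ab12cd34
print(extract_verification_token("no link in this email"))  # None
```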

ElevenLabs: Giving Agents a Voice

For certain use cases, I wanted AI agents to be able to communicate using voice. ElevenLabs provides a powerful text-to-speech API that allowed me to generate realistic and expressive voices for the agents.

  • High-Quality Voices: ElevenLabs offers a wide range of natural-sounding voices.
  • Customization: You can customize the voice parameters, such as pitch, speed, and accent.
  • Easy Integration: The API is straightforward to use.

import requests

XI_API_KEY = "YOUR_ELEVENLABS_API_KEY"
VOICE_ID = "pNInz6obpgDQGcFmaJgB"

url = f"https://api.elevenlabs.io/v1/text-to-speech/{VOICE_ID}"

headers = {
  "xi-api-key": XI_API_KEY,
  "Content-Type": "application/json"
}

def text_to_speech(text: str):
    data = {
        "text": text,
        "model_id": "eleven_monolingual_v1",
        "voice_settings": {
            "stability": 0.5,
            "similarity_boost": 0.5
        }
    }
    response = requests.post(url, headers=headers, json=data, stream=True)

    if response.status_code == 200:
        with open("output.mp3", "wb") as f:
            for chunk in response.iter_content(chunk_size=1024):
                if chunk:
                    f.write(chunk)
        return "output.mp3" # Return the file path
    else:
        print(f"Error: {response.status_code} - {response.text}")
        return None

Challenges and Lessons Learned

Building EqhoIDs wasn't without its challenges. Here are a few things I learned along the way:

  • Security is paramount: Implementing robust authentication and authorization mechanisms is crucial. I spent a significant amount of time researching and implementing best practices for securing the API.
  • Scalability considerations: Designing the system with scalability in mind from the beginning is essential. Using caching and choosing the right database are key factors.
  • Email deliverability: Ensuring that verification emails are delivered reliably can be tricky. I had to configure SPF, DKIM, and DMARC records to improve deliverability.
  • AI voice consistency: Getting the AI voice to sound consistent across different text inputs required careful tuning of the ElevenLabs voice parameters.
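
On the security point, the core of agent authentication is issuing a token the agent can later present, and verifying it server-side. This isn't EqhoIDs' actual token scheme; it's a minimal stdlib sketch of the idea (in practice you'd likely reach for signed JWTs via a library, and the secret would come from a secrets manager):

```python
import base64
import hashlib
import hmac
import json
import time
from typing import Optional

SECRET = b"demo-secret"  # placeholder; never hard-code real secrets

def issue_token(agent_id: str, ttl: int = 3600) -> str:
    # Token = base64(payload) + "." + base64(HMAC-SHA256(payload))
    payload = json.dumps({"sub": agent_id, "exp": time.time() + ttl}).encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).digest()
    return (base64.urlsafe_b64encode(payload).decode()
            + "." + base64.urlsafe_b64encode(sig).decode())

def verify_token(token: str) -> Optional[str]:
    # Returns the agent id if the token is authentic and unexpired, else None.
    try:
        payload_b64, sig_b64 = token.split(".")
        payload = base64.urlsafe_b64decode(payload_b64)
        sig = base64.urlsafe_b64decode(sig_b64)
    except ValueError:
        return None
    expected = hmac.new(SECRET, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(sig, expected):  # constant-time comparison
        return None
    claims = json.loads(payload)
    if claims["exp"] < time.time():
        return None
    return claims["sub"]

token = issue_token("agent-001")
print(verify_token(token))          # agent-001
print(verify_token("not-a-token"))  # None
```

The two details worth copying even from a toy version: compare signatures with `hmac.compare_digest` (constant-time, resists timing attacks) and always enforce an expiry.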

Conclusion

Building EqhoIDs has been a rewarding experience. It's a complex project that touches on several interesting areas, including AI, identity management, and distributed systems. I'm excited to see how AI agents will evolve in the future and how EqhoIDs can play a role in enabling a more secure and trustworthy ecosystem.

I hope this behind-the-scenes look has been helpful. I'm always open to feedback and suggestions, so please feel free to leave your comments below! What are your thoughts on AI agent identity? What other technologies would you consider for this project?

Thanks for reading!
