SilverFox AI
LiteLLM Proxy Setup Guide for macOS with UV (Beginner-Friendly)

🎯 Goal: Quickly set up LiteLLM proxy on Mac in 15-20 minutes using the modern UV package manager without creating a formal project structure.

πŸ“‹ What You'll Need

  • Mac with macOS 10.15 or newer
  • Internet connection
  • 20 minutes of your time
  • One API key (OpenAI, Anthropic, Groq, or other)

πŸ›  Step 1: Install Required Tools

1.1 Install Homebrew (if not already installed)

Open Terminal and run:

# Check if Homebrew is installed
brew --version

# If you get an error, install Homebrew:
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"

1.2 Install UV (modern Python package manager)

# Install UV via Homebrew
brew install uv

# Verify installation
uv --version

1.3 Install PostgreSQL

# Install PostgreSQL
brew install postgresql@14

# postgresql@14 is keg-only, so its binaries (psql, createdb) aren't on your PATH by default.
# Add them (path shown for Apple Silicon; on Intel Macs use /usr/local/opt/...):
echo 'export PATH="/opt/homebrew/opt/postgresql@14/bin:$PATH"' >> ~/.zshrc
source ~/.zshrc

# Start PostgreSQL service
brew services start postgresql@14

# Verify it's running
brew services list | grep postgresql

πŸ’Ύ Step 2: Create Database

# Create database for LiteLLM
createdb litellm_db

# Verify database was created
psql -l | grep litellm_db

# Get your username for database connection
whoami

Write down the result of whoami: this is your database username!
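The DATABASE_URL you'll write in Step 6 is just this username dropped into the standard PostgreSQL URL format. As a sketch, the pieces assemble like this (build_database_url is an illustrative helper, not part of LiteLLM):

```python
import getpass

def build_database_url(user, host="localhost", port=5432, db="litellm_db"):
    # Standard PostgreSQL URL: postgresql://USER@HOST:PORT/DBNAME
    # (no password segment, since a local Homebrew Postgres usually trusts your user)
    return f"postgresql://{user}@{host}:{port}/{db}"

# getpass.getuser() returns the same name as `whoami`
print(build_database_url(getpass.getuser()))
```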

🐍 Step 3: Set Up Virtual Environment

Instead of initializing a full project, we'll create a simple directory with a virtual environment.

# Go to your preferred directory, e.g., Documents
cd ~/Documents

# Create a folder for our setup and navigate into it
mkdir litellm-setup
cd litellm-setup

# Create a virtual environment using UV
uv venv

# Activate the virtual environment
source .venv/bin/activate

# Your terminal prompt should now start with (.venv)

⚠️ Important: All subsequent commands should be run inside this activated environment.

πŸ“¦ Step 4: Install LiteLLM and Dependencies

# Install LiteLLM with proxy support and database drivers
uv pip install "litellm[proxy]" prisma psycopg2-binary

# Verify installation (since the environment is active, you don't need "uv run")
litellm --version

πŸ”§ Step 5: Generate Prisma Client

LiteLLM uses Prisma for database interactions. We need to generate the Python client:

# Find the exact path to schema.prisma within the installed litellm package
find .venv -name "schema.prisma" -path "*/litellm/*"

# Generate Prisma client (replace the path with the output from the command above)
python -m prisma generate --schema .venv/lib/python3.12/site-packages/litellm/proxy/schema.prisma

Important: Your Python version (e.g., python3.12) may differ. Use the find command to get the correct path.
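If you'd rather not copy the find output by hand, a few lines of Python can resolve the path regardless of your Python version. This locator is my own sketch, not part of LiteLLM or Prisma:

```python
from pathlib import Path

def find_prisma_schema(venv=".venv"):
    # Match any python3.x under the venv so the snippet survives version bumps
    matches = sorted(Path(venv).glob(
        "lib/python*/site-packages/litellm/proxy/schema.prisma"))
    return matches[0] if matches else None

schema = find_prisma_schema()
if schema:
    print(f"python -m prisma generate --schema {schema}")
```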

βš™οΈ Step 6: Create Configuration

6.1 Create config.yaml

# Create configuration file
nano config.yaml

Insert the following content:

litellm_settings:
  drop_params: true

general_settings:
  # Main settings will be taken from environment variables

# List of available models
model_list:
  - model_name: groq-llama3-fast        # Your custom alias (any name)
    litellm_params:
      model: groq/llama3-8b-8192        # Real model name from LiteLLM
      api_key: os.environ/GROQ_API_KEY  # Key via environment variables

  - model_name: gpt-3.5-turbo           # Additional example with OpenAI
    litellm_params:
      model: gpt-3.5-turbo
      api_key: os.environ/OPENAI_API_KEY

Save file: Ctrl+O, then Enter, then Ctrl+X

6.2 Create environment variables file

# Create .env file
nano .env

Insert (replace YOUR_USERNAME with the result of the whoami command):

# LiteLLM settings
LITELLM_MASTER_KEY="sk-1234"
LITELLM_SALT_KEY="sk-1234"
LITELLM_DROP_PARAMS=True

# Server settings
PORT=4000
STORE_MODEL_IN_DB="True"  # Allow storing models in DB

# Database connection (replace 'admin' with YOUR_USERNAME if it's different)
DATABASE_URL=postgresql://admin@localhost:5432/litellm_db

# API keys (add your real keys)
GROQ_API_KEY="your-groq-api-key-here"
OPENAI_API_KEY="your-openai-key-here"
ANTHROPIC_API_KEY="your-anthropic-key-here"

Save file: Ctrl+O, then Enter, then Ctrl+X
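For reference, a .env file is just KEY=VALUE lines. The tiny parser below is my own sketch of how such a file is interpreted (quotes stripped, # comments ignored); it is not LiteLLM's actual loader, which handles more edge cases such as # characters inside quoted values:

```python
def parse_env(text):
    # Minimal .env reader: KEY=VALUE lines, full-line and inline # comments,
    # optional single/double quotes around values.
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comment lines
        key, _, value = line.partition("=")
        value = value.split("#", 1)[0].strip()  # drop inline comments
        env[key.strip()] = value.strip('"').strip("'")
    return env

sample = 'PORT=4000\nLITELLM_MASTER_KEY="sk-1234"\n# a comment\n'
print(parse_env(sample))
```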

πŸš€ Step 7: Start LiteLLM

# Start LiteLLM proxy (it will automatically load the .env file)
litellm --config config.yaml --port 4000

# On the first run, LiteLLM will automatically create tables in the database

# You'll see:
# LiteLLM: Proxy initialized with Config, Set models:...
# INFO:     Uvicorn running on http://0.0.0.0:4000

πŸŽ‰ Step 8: Verify Everything Works

8.1 Open UI

In your browser, go to: http://localhost:4000/ui (the bare http://localhost:4000 serves the API docs instead)

You'll see the LiteLLM web interface with model management, keys, and statistics!

8.2 Test Model via UI

  1. In the web interface, find the "Test Key" section.
  2. Select the groq-llama3-fast model.
  3. Write a test message, e.g.: "Hello! How are you?"
  4. Click send.
  5. You'll get a response from the model!

8.3 Test API via Terminal

Open a new terminal and run:

curl -X POST 'http://localhost:4000/chat/completions' \
-H 'Content-Type: application/json' \
-H 'Authorization: Bearer sk-1234' \
-d '{
  "model": "groq-llama3-fast",
  "messages": [{"role": "user", "content": "Hello! How are you?"}]
}'

If everything works correctly, you'll get a response from the Groq Llama3 model!
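The same call can be made from Python with nothing but the standard library. The helper below is a sketch that mirrors the curl request above (endpoint, headers, master key, and model alias all come from the earlier steps):

```python
import json
import urllib.request

def make_chat_request(base_url, api_key, model, messages):
    # Build an OpenAI-compatible /chat/completions request for the proxy
    body = json.dumps({"model": model, "messages": messages}).encode()
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

req = make_chat_request(
    "http://localhost:4000", "sk-1234", "groq-llama3-fast",
    [{"role": "user", "content": "Hello! How are you?"}],
)
# With the proxy running, send it with:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```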


πŸ—„οΈ LiteLLM Database Management

After starting the LiteLLM Proxy, it will begin storing data in your PostgreSQL database. Here's how you can manage this data.

Checking Database Content

To view the tables and data in your litellm_db database:

  1. Connect to the litellm_db database using psql (replace admin with the username from whoami if yours differs):

    psql -U admin -h localhost -p 5432 litellm_db
    

    Enter your password if prompted (a local Homebrew install usually doesn't set one).

  2. View the list of tables:

    \dt
    

    You will see a list of tables created by LiteLLM, such as:
    LiteLLM_AuditLog, LiteLLM_BudgetTable, LiteLLM_Config, LiteLLM_CredentialsTable, LiteLLM_CronJob, LiteLLM_DailyTagSpend, LiteLLM_DailyTeamSpend, LiteLLM_DailyUserSpend, LiteLLM_EndUserTable, LiteLLM_ErrorLogs, LiteLLM_GuardrailsTable, LiteLLM_HealthCheckTable, LiteLLM_InvitationLink, LiteLLM_MCPServerTable, LiteLLM_ManagedFileTable, LiteLLM_ManagedObjectTable, LiteLLM_ManagedVectorStoresTable, LiteLLM_ModelTable, LiteLLM_ObjectPermissionTable, LiteLLM_OrganizationMembership, LiteLLM_OrganizationTable, LiteLLM_PromptTable, LiteLLM_ProxyModelTable, LiteLLM_SpendLogs, LiteLLM_TeamMembership, LiteLLM_TeamTable, LiteLLM_UserNotifications, LiteLLM_UserTable, LiteLLM_VerificationToken, _prisma_migrations.

  3. View the schema of a specific table (optional):
    For example, for the LiteLLM_SpendLogs table:

    \d "LiteLLM_SpendLogs"
    

    (Use double quotes if the table name contains uppercase letters or special characters).

  4. View a few records from a table (optional):
    For example, for the LiteLLM_SpendLogs table:

    SELECT * FROM "LiteLLM_SpendLogs" LIMIT 5;
    
  5. Exit psql:

    \q
    

Deleting the Database

If you want to completely delete the litellm_db database (e.g., to start with a clean slate), follow these steps. This operation is irreversible and will result in the loss of all data in this database.

  1. Connect to any other database (e.g., postgres), since you cannot drop a database you are currently connected to:

    psql -U admin -h localhost -p 5432 postgres
    

    Enter your password if prompted.

  2. Delete the litellm_db database:

    DROP DATABASE litellm_db;
    
  3. Exit psql:

    \q
    

    The next time LiteLLM starts, it will automatically recreate the litellm_db database and apply all migrations.

Clearing Data (Logs, Spend, Tokens)

If you want to clear only specific data (e.g., logs or spend records) without deleting the entire database, you can use the TRUNCATE TABLE command. This will remove all records from the table but leave the table itself and its structure intact.

Important: Make sure the LiteLLM Proxy is not running when you execute these commands to avoid conflicts.

  1. Connect to the litellm_db database using psql:

    psql -U admin -h localhost -p 5432 litellm_db
    

    Enter your password if prompted.

  2. Clear the desired tables:

     *   **Request and error logs:**

         TRUNCATE TABLE "LiteLLM_AuditLog" RESTART IDENTITY;
         TRUNCATE TABLE "LiteLLM_ErrorLogs" RESTART IDENTITY;

         (`RESTART IDENTITY` resets auto-incrementing ID counters, if any.)

     *   **Detailed spend records (price, tokens):**

         TRUNCATE TABLE "LiteLLM_SpendLogs" RESTART IDENTITY;

     *   **Aggregated daily spend (users, teams, tags):**

         TRUNCATE TABLE "LiteLLM_DailyUserSpend" RESTART IDENTITY;
         TRUNCATE TABLE "LiteLLM_DailyTeamSpend" RESTART IDENTITY;
         TRUNCATE TABLE "LiteLLM_DailyTagSpend" RESTART IDENTITY;

     *   **Other tables that may contain data you want to clear (depends on your needs):**
         *   `LiteLLM_HealthCheckTable`
         *   `LiteLLM_CronJob`
         *   `LiteLLM_UserNotifications`

  3. Exit psql:

    \q
    

πŸš€ Step 9: Create Alias for Convenience (Optional)

To run LiteLLM from any folder, create an alias:

9.1 Add alias to your shell

# Open shell configuration file (for zsh)
nano ~/.zshrc

# Or for bash:
# nano ~/.bash_profile

Add at the end of the file (replace /Users/admin/Documents/litellm-setup with your actual path):

# LiteLLM alias for quick start
alias litellm-start="cd /Users/admin/Documents/litellm-setup && source .venv/bin/activate && litellm --config config.yaml --port 4000"

9.2 Reload configuration

# For zsh:
source ~/.zshrc

# For bash:
# source ~/.bash_profile

9.3 Now you can start from anywhere:

# From any folder:
litellm-start

# LiteLLM will start automatically!

πŸ†˜ Troubleshooting

Problem: "command not found: litellm"

Solution: The virtual environment isn't active. Run this command from your project folder:

source .venv/bin/activate

After this, the litellm command will be available.

Problem: "ModuleNotFoundError: No module named 'prisma'"

Solution: Your installation was likely incomplete. Make sure prisma is installed in your active environment:

# Make sure .venv is active
uv pip install prisma

Problem: "The Client hasn't been generated yet..."

Solution: Run the Prisma generate command again, ensuring the path to schema.prisma is correct.

Problem: "Database connection failed" or "Database not connected"

Solution:

# Check if PostgreSQL is running
brew services list | grep postgresql

# If not running:
brew services start postgresql@14

# Check if database exists:
psql -l | grep litellm_db

# Ensure the DATABASE_URL in your .env file is correct (username, database name).

Problem: "Config file not found"

Solution: Make sure the config.yaml file is in the same directory from which you are running the litellm command.

Problem: "Invalid API key"

Solution: Double-check the API keys in your .env file.

Problem: "Port 4000 already in use"

Solution: Use a different port:

litellm --config config.yaml --port 4001
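To confirm which port is actually free before retrying, a quick standard-library check works. This port_in_use helper is illustrative, not a LiteLLM utility:

```python
import socket

def port_in_use(port, host="127.0.0.1"):
    # connect_ex returns 0 when something is already listening on the port
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1)
        return s.connect_ex((host, port)) == 0

if port_in_use(4000):
    print("Port 4000 is busy; try --port 4001")
```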

🎯 What You've Achieved

βœ… LiteLLM proxy server on localhost:4000

βœ… Web UI for model management

βœ… Unified API for all LLM providers

βœ… Ready-to-share project setup

βœ… Modern UV Python ecosystem

βœ… Database integration for advanced features

βœ… Convenient alias for daily use

Congratulations! You can now use any LLM through a single interface! πŸš€


πŸ’‘ Why This Setup is Powerful

  • Unified Interface: Access OpenAI, Anthropic, Groq, and 100+ other models through one API
  • Cost Tracking: Built-in spend monitoring and budget controls
  • Rate Limiting: Prevent API abuse and manage usage
  • Modern Tooling: UV provides faster, more reliable package management
  • Production Ready: PostgreSQL backend for scaling
  • Developer Friendly: Easy to share and reproduce across teams

πŸ”„ Next Steps

  1. Add more models to your config.yaml
  2. Create API keys with different permissions and limits
  3. Set up monitoring and alerts
  4. Integrate with your applications using the OpenAI-compatible API
  5. Scale by deploying to cloud platforms

Happy coding! πŸŽ‰
