Femi Raphael
How to Plug AIsa into your Hermes Agent in 2 Minutes (Without Rebuilding Your Setup)

Most people make Hermes way harder to run than it needs to be.
They get the runtime working, but then they start changing providers, changing model configs, changing endpoints, and before long the setup itself becomes the problem.

The easier approach is to keep Hermes pointed at one OpenAI-compatible layer and switch models from there.

That is exactly where AIsa fits.

Hermes already supports any OpenAI-compatible API as a custom provider. AIsa is OpenAI-compatible and uses a single base URL at https://api.aisa.one/v1, so the integration path is just the normal Hermes custom endpoint flow.

  • No special adapters
  • No hacky workarounds

Here are the two ways to set it up.

Method 1: The CLI Setup (Quickest)

The fastest way to get routing is through the Hermes CLI.

  1. Generate your API key on the AIsa dashboard

  2. Run hermes model in your terminal

  3. Choose Custom endpoint

  4. Enter the AIsa base URL: https://api.aisa.one/v1

  5. Enter your AIsa API key (the one you generated from the marketplace dashboard)

  6. Pick the model you want Hermes to call (e.g., qwen-3.6-plus or claude-opus-4-6)

This is the standard Hermes quickstart flow, which means you can switch providers or models at any time just by running hermes model again.
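Because AIsa is OpenAI-compatible, every request Hermes sends through this custom endpoint is just the standard chat-completions wire format. Here is a minimal sketch of what that request shape looks like — the model name and API key are placeholders, and build_chat_request is only an illustrative helper, not part of Hermes or AIsa:

```python
import json

BASE_URL = "https://api.aisa.one/v1"   # AIsa's single base URL
API_KEY = "YOUR_AISA_API_KEY"          # placeholder: paste your dashboard key

def build_chat_request(model: str, prompt: str) -> dict:
    """Assemble an OpenAI-compatible chat-completions request."""
    return {
        "url": f"{BASE_URL}/chat/completions",
        "headers": {
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }),
    }

req = build_chat_request("qwen-3.6-plus", "ping")
print(req["url"])  # https://api.aisa.one/v1/chat/completions
```

If this shape gets a valid completion back from any OpenAI SDK or curl call, Hermes will work with it too — that is all "OpenAI-compatible" means here.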

Method 2: The Config Setup
If you prefer keeping things in code rather than using the setup wizard, Hermes supports the custom provider path directly in your config file.

Open up ~/.hermes/config.yaml and drop this in:

```yaml
model:
  provider: custom
  base_url: https://api.aisa.one/v1
  api_key: YOUR_AISA_API_KEY
  default: YOUR_MODEL_NAME
```

Why do it this way?

The benefit here is simple. AIsa gives you one base URL and one API key for accessing every major model, while keeping the request format strictly OpenAI-compatible.

You also get provider-agnostic usage tracking and unified billing. This means you aren't rebuilding your Hermes setup or managing five different API dashboards every time you want to test if Llama 3 handles a task better than GPT-4o.
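To make that concrete: because the request format stays strictly OpenAI-compatible, switching models changes exactly one field in the payload. A small sketch (the model names are just examples, and chat_request is a hypothetical helper for illustration):

```python
def chat_request(model: str, prompt: str) -> dict:
    """Same OpenAI-compatible payload regardless of which model you target."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

llama = chat_request("llama-3-70b", "Summarize this ticket")
gpt4o = chat_request("gpt-4o", "Summarize this ticket")

# Everything except the model name is identical, which is why
# A/B-testing models through AIsa never touches the rest of your setup.
print(llama["model"], gpt4o["model"])  # llama-3-70b gpt-4o
```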

A Quick Reality Check on Fallbacks

If you are doing this for reliability, Hermes recently added support for ordered fallback provider chains through fallback_providers in the config file.

NB: there are currently a few fresh bug reports showing fallback issues on the Hermes API server path.

So the cleanest, most stable recommendation right now is:

  1. Start with AIsa as your Hermes custom provider
  2. Get your primary model working stably first
  3. Add the fallback logic later once the base path is solid
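When you do get to step 3, the config might look roughly like this. Only the fallback_providers key name comes from Hermes itself — the nested layout below is an assumption, so verify it against the Hermes config documentation before relying on it:

```yaml
# Sketch only: the fallback_providers key exists in Hermes, but this
# nested structure is assumed -- check the official config schema.
model:
  provider: custom
  base_url: https://api.aisa.one/v1
  api_key: YOUR_AISA_API_KEY
  default: YOUR_MODEL_NAME
  fallback_providers:
    - provider: custom
      base_url: https://api.aisa.one/v1
      api_key: YOUR_AISA_API_KEY
      model: A_BACKUP_MODEL_NAME
```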

This gives you a much simpler setup, better model switching, and way less provider churn while you’re building.


(P.S. AIsa gives new users free API credits, so you can test this exact routing setup on your agent right now without putting in a card).
