DEV Community

Qasim Muhammad

Configure AI provider from the Command Line

AI features in the CLI run through configurable providers — local or cloud. nylas ai config sets that up.

The nylas ai config command sets up which AI provider the CLI uses for smart compose, inbox analysis, and natural language scheduling. Choose between local models (Ollama, LM Studio) for privacy or cloud providers (OpenAI, Anthropic) for capability.

Syntax

nylas ai config

Examples

Interactive setup:

nylas ai config

Check current AI provider:

nylas ai config --json

When to Use This

Reach for nylas ai config when configuring which AI model the CLI uses for smart features. Combine with --json to pipe output into other tools.
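For scripting, the JSON output pipes cleanly into jq. A minimal sketch; note that the field names below (.provider, .model) are assumptions for illustration, so inspect the actual output of nylas ai config --json for the real shape:

```shell
# Extract the active provider from the config output.
# NOTE: the .provider and .model keys are hypothetical -- check
# `nylas ai config --json` for the actual field names.
nylas ai config --json | jq -r '.provider'

# Or combine several fields into one line:
nylas ai config --json | jq -r '"\(.provider) / \(.model)"'
```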

How It Works

AI features in the CLI are provider-agnostic. Choose Ollama for local inference (your data never leaves your machine), or use OpenAI/Anthropic APIs for more capable models. The CLI handles the provider abstraction.
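For the local-first route, a typical setup sequence looks roughly like this (assumes Ollama is installed; llama3 is just an example model name):

```shell
# Fetch a model and start the local inference server before
# configuring the CLI. Requires Ollama (https://ollama.com).
ollama pull llama3    # download a model for local inference
ollama serve &        # expose the local API (default: http://localhost:11434)
nylas ai config       # then select the local provider interactively
```

With this setup, prompts and mailbox data stay on your machine; swapping to a cloud provider later is just a matter of re-running nylas ai config.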


Full docs: nylas ai config reference — all flags, advanced examples, and troubleshooting.

All commands: Nylas CLI Command Reference

Get started: brew install nylas/nylas-cli/nylas (other install methods available)
