LocPilot

Originally published at locpilot.com

Use Both Local and Cloud LLMs in Microsoft Word — Seamlessly

Most people think they have to choose between the privacy of local AI models and the power of cloud LLMs. But with the right setup, you can actually use both inside Microsoft Word — and switch between them effortlessly.

A Flexible Setup: Local + Cloud

Modern teams increasingly need the privacy of on-premise models for sensitive documents, while still relying on advanced cloud models for heavier work. With today’s tooling, you don’t need to lock yourself into one side. You can configure Microsoft Word to access local LLMs running on your machine while also tapping into cloud models from OpenAI, Anthropic, or others whenever you need more horsepower.

LiteLLM: The Bridge That Makes It Work

The key to this hybrid workflow is LiteLLM. It serves as a unified gateway that lets you route prompts to different models — local or cloud — through one consistent API. Your Word setup doesn’t need to know which model is running or where. LiteLLM takes care of the complexity. It essentially centralizes all your LLM endpoints so Word can operate with them as if they were one.
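To make this concrete, here is a minimal sketch of what a LiteLLM proxy configuration might look like. The model names, port, and aliases below are placeholders for illustration, not part of any LocPilot setup; adjust them to whatever models you actually run:

```yaml
# config.yaml — hypothetical example; swap in your own model names and keys
model_list:
  - model_name: local-llama            # the alias Word-side tools will see
    litellm_params:
      model: ollama/llama3             # a local model served by Ollama
      api_base: http://localhost:11434
  - model_name: gpt-4o                 # a cloud alias behind the same API
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
```

Starting the proxy with `litellm --config config.yaml` then exposes both models through one OpenAI-compatible endpoint (by default at `http://localhost:4000`), which is exactly the "single gateway" the hybrid workflow relies on.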

LocPilot: A Local Word Add-In That Connects LiteLLM

Once LiteLLM is running, LocPilot for Word connects directly to it as a local Word Add-in. This allows you to:

  • Use your local LLMs with complete privacy

  • Switch to cloud LLMs when a task calls for larger, more capable models

  • Maintain a single, familiar interface inside Microsoft Word

Because LocPilot for Word communicates with LiteLLM, you can easily select the right model for each situation.
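To see why switching is effortless, consider how any OpenAI-compatible client talks to such a gateway. This is a generic sketch, not LocPilot's internal code; the model aliases are assumptions carried over from a hypothetical LiteLLM config:

```python
# Hypothetical sketch: local and cloud requests share one API shape;
# only the model alias changes.

def build_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat payload for a LiteLLM gateway."""
    return {
        "model": model,  # e.g. "local-llama" or "gpt-4o" — same shape either way
        "messages": [{"role": "user", "content": prompt}],
    }

local_req = build_request("local-llama", "Summarize this confidential memo.")
cloud_req = build_request("gpt-4o", "Draft a long market analysis.")

# Switching models changes exactly one field; everything else is identical.
assert set(local_req) == set(cloud_req)
```

Because the payload shape never changes, the front end (here, a Word add-in) only needs a dropdown of model aliases, not separate integrations per provider.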

Privacy When It Matters, Power When You Need It

This hybrid approach gives you the best of both worlds:

  • Local inference keeps sensitive documents on your machine

  • Cloud inference is always available when you need larger models — only the text you choose is uploaded

  • Instant switching makes the experience seamless

  • Zero monthly subscription fees — you rely on your own hardware plus whatever pay-as-you-go cloud tokens you use

It’s a privacy-first, cost-effective, and flexible alternative to Microsoft Copilot in Word.
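The tradeoff above can even be expressed as a simple routing rule. This is purely illustrative — LocPilot lets you pick the model manually, and the marker list and model aliases here are invented for the example:

```python
# Hypothetical routing policy — an illustration of the hybrid idea, not LocPilot code.
SENSITIVE_MARKERS = ("confidential", "ssn", "patient", "salary")

def choose_model(text: str, prefer_cloud: bool = False) -> str:
    """Keep sensitive text on the local model; otherwise honor the user's choice."""
    if any(marker in text.lower() for marker in SENSITIVE_MARKERS):
        return "local-llama"   # stays on your machine
    return "gpt-4o" if prefer_cloud else "local-llama"
```

Whether such a rule is automated or applied by hand, the point is the same: sensitive text never has to leave the machine, while everything else can use whatever model is strongest.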

A Smarter Path Forward

Instead of choosing between privacy and capability, you can have both. By combining LiteLLM with LocPilot for Word, Microsoft Word becomes a powerful AI-assisted toolkit that adapts to your workload and keeps your data under your control.

If you want a setup that balances security, performance, and affordability, this local+cloud hybrid approach is one of the most effective ways to work with LLMs in Word today.

Here is a quick demo, powered by GPTLocalhost, which offers the same core features for individual use. LocPilot for Word is the intranet edition of GPTLocalhost, designed for enterprise users and team collaboration.

For more creative uses of local and private LLMs in Microsoft Word, explore additional demos available on our channel at @LocPilot. We hope this demonstration sparks new ideas and encourages you to create further applications that equip your team with cutting-edge on-premise AI capabilities. If you have any particular suggestions or concepts, please feel free to reach out to us at info@locpilot.com. Our mission is to make local AI more accessible and affordable for teamwork.
