GPTLocalhost

Originally published at gptlocalhost.com

Use OpenLLM in Microsoft Word Locally. No Recurring Inference Costs.

Looking for a Microsoft Copilot alternative without recurring inference costs? You can get there by pairing Microsoft Word with OpenLLM and a locally hosted LLM. OpenLLM lets you serve both open-source and custom models behind OpenAI-compatible APIs with a single command. It ships with a ready-to-use chat UI and a high-performance inference backend, and it makes enterprise-grade cloud deployment straightforward with tools like Docker, Kubernetes, and BentoCloud.
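To give a sense of what "OpenAI-compatible" means in practice, here is a minimal sketch of querying a locally running OpenLLM server from Python using the standard openai client. The port, model tag, and prompt are placeholders for illustration, so adjust them to match the model you actually started OpenLLM with.

```python
# Minimal sketch: talk to a local OpenLLM server through its OpenAI-compatible API.
# The base_url port and the model tag below are assumptions; change them to your setup.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:3000/v1",  # assumed local OpenLLM endpoint
    api_key="na",                         # a local server needs no real API key
)

response = client.chat.completions.create(
    model="llama3.2:1b",  # placeholder model tag
    messages=[{"role": "user", "content": "Summarize this paragraph in one sentence: ..."}],
)
print(response.choices[0].message.content)
```

Because the add-in only needs an OpenAI-compatible endpoint, the same setup works whether the request comes from a script like this or from Microsoft Word itself.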

Here is a quick demonstration of OpenLLM running inside Microsoft Word on a local machine, with no recurring inference costs. For further examples, visit our video library at @GPTLocalhost!

