Akmal Chaudhri for SingleStore

Quick tip: Running OpenAI's Swarm locally using Ollama

Abstract

This short article shows how to integrate OpenAI's Swarm with Ollama without requiring direct OpenAI access, enabling efficient, scalable AI workflows using alternative large language models.

The notebook file used in this article is available on GitHub.

Introduction

In a previous article, we used OpenAI's Swarm, but it required an OpenAI API key and an OpenAI large language model. However, Swarm can easily be configured to use Ollama and alternative large language models instead. In this article, we'll see how.

Installing Ollama is quick and easy. On Linux, for example, we can use the following command:

curl -fsSL https://ollama.com/install.sh | sh
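If the Ollama server isn't already running as a background service (the Linux installer usually sets one up), it can be started manually:

ollama serve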

With Ollama installed and running, we'll launch Jupyter locally:

jupyter notebook

Fill out the notebook

We'll then create a new notebook and first install the following:

!pip install git+https://github.com/openai/swarm.git --quiet
!pip install ollama openai --quiet

Next, we'll import the following:

import ollama

from openai import OpenAI

and then download a model using Ollama:

model = "llama3.2:1b"

ollama.pull(model)
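As an optional sanity check, we can confirm the model responds by calling the ollama package directly (the prompt here is arbitrary):

# Quick test that the pulled model answers a simple prompt
response = ollama.chat(
    model = model,
    messages = [{"role": "user", "content": "Say hello in one sentence."}]
)

print(response["message"]["content"])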

Ollama exposes an OpenAI-compatible endpoint, so we can point the OpenAI client at it as follows (the api_key value is required by the client but ignored by Ollama):

ollama_client = OpenAI(
    base_url = "http://localhost:11434/v1",
    api_key = "ollama"
)
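A quick test call through this client confirms the wiring before we hand it to Swarm (again, the prompt is arbitrary):

# Standard OpenAI chat completions call, served locally by Ollama
completion = ollama_client.chat.completions.create(
    model = model,
    messages = [{"role": "user", "content": "What is a large language model?"}]
)

print(completion.choices[0].message.content)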

As a quick example, we'll slightly modify the Swarm demo provided by OpenAI on GitHub:

from swarm import Swarm, Agent

# Pass the Ollama-backed client instead of letting Swarm create a default OpenAI client
client = Swarm(ollama_client)

# Returning another agent from a function triggers a handoff to that agent
def transfer_to_agent_b():
    return agent_b

agent_a = Agent(
    name = "Agent A",
    model = model,
    instructions = "You are a helpful agent.",
    functions = [transfer_to_agent_b],
)

agent_b = Agent(
    name = "Agent B",
    model = model,
    instructions = "Only speak in Haikus.",
)

response = client.run(
    agent = agent_a,
    messages = [{"role": "user", "content": "I want to talk to agent B."}],
)

print(response.messages[-1]["content"])

In the code, we've:

  1. Changed client = Swarm() to client = Swarm(ollama_client)
  2. Added model = model to both agents
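To confirm the handoff actually happened, we can also inspect which agent produced the final message, since Swarm's Response object includes the last active agent:

# Agent B should have produced the final (haiku) reply
print(response.agent.name)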

And that's it!

Summary

In this short article, we've quickly and easily configured OpenAI's Swarm to run locally using Ollama.
