OpenAI’s function calling (now called tool_calls) is one of the most powerful ways to extend GPT. It lets the model reason over structured actions and trigger code, giving you the foundation for real AI agents.
But for Ruby developers, using tools meant writing manual JSON schemas, messy parsing, or gluing Python into the mix.
So I built this: openai-toolable
A tiny Ruby gem to make your methods OpenAI-toolable in seconds.
What It Does
openai-toolable gives you a simple way to:
- Annotate any Ruby method as a tool
- Auto-generate OpenAI-compatible tool schemas
- Handle tool calls from OpenAI response JSON
- Return function results cleanly — no boilerplate
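Under the hood, an “OpenAI-compatible tool schema” is the standard Chat Completions function shape. Written out by hand for comparison, this is roughly what the gem generates for you (the `get_weather` example here mirrors the one used later in this post):

```ruby
# The standard OpenAI Chat Completions tool schema, built by hand.
# openai-toolable generates this structure from your method annotations.
weather_schema = {
  type: "function",
  function: {
    name: "get_weather",
    description: "Get the current weather in a given location",
    parameters: {
      type: "object",
      properties: {
        location: {
          type: "string",
          description: "The city and state, e.g. San Francisco, CA"
        }
      },
      required: ["location"]
    }
  }
}
```

Writing these hashes manually for every method is exactly the boilerplate the gem removes.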
It’s perfect for building:
- AI agents with Ruby backends
- GPT-augmented Rails apps
- CLI tools, Discord bots, copilots — you name it
Quick Example
```ruby
require "openai/toolable"
require "openai"

# Set your API key
OPENAI_API_KEY = ENV["OPENAI_API_KEY"]

# Define a tool with required parameters
weather_tool = Openai::Toolable::ToolFactory.build(
  name: "get_weather",
  type: "function",
  description: "Get the current weather in a given location",
  parameters: [
    { name: "location", type: :string, description: "The city and state, e.g. San Francisco, CA", required: true },
    { name: "unit", type: :string, description: "The unit of temperature, e.g. celsius or fahrenheit", required: true }
  ]
)

# Create a tool handler and register the tool
tool_handler = Openai::Toolable::ToolHandler.new
tool_handler.register(
  name: "get_weather",
  lambda: ->(location:, unit:) { puts "Getting the weather in #{location} (#{unit})..." }
)
```
Now you can generate the tool spec and call the tool from an OpenAI response:
```ruby
# Create a client
client = OpenAI::Client.new(api_key: OPENAI_API_KEY)

begin
  # Create a chat completion
  response = client.chat.completions.create(
    model: "gpt-4o-mini",
    messages: [{ role: "user", content: "What's the weather like in Boston?" }],
    tools: [weather_tool.to_json],
    tool_choice: "auto"
  )

  # Handle the response
  tool_handler.handle(response: response)
rescue StandardError => e
  puts "An error occurred: #{e.message}"
end
```
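For illustration, here is a hedged sketch of what a handler like this does internally: read the tool_calls array from the response message, JSON-parse each call’s arguments, and invoke the matching registered lambda with keyword arguments. The `registry` hash and the response shape below are illustrative, not the gem’s actual internals.

```ruby
require "json"

# Illustrative registry of tool lambdas, keyed by tool name.
registry = {
  "get_weather" => ->(location:, unit:) { "#{location}: 21 degrees #{unit}" }
}

# A simplified assistant message containing one tool call, as the
# Chat Completions API returns it (arguments arrive as a JSON string).
response_message = {
  "tool_calls" => [
    {
      "id" => "call_abc123",
      "function" => {
        "name" => "get_weather",
        "arguments" => '{"location":"Boston, MA","unit":"celsius"}'
      }
    }
  ]
}

# Dispatch: parse each call's arguments and invoke the registered lambda.
results = response_message["tool_calls"].map do |call|
  fn = call["function"]
  args = JSON.parse(fn["arguments"], symbolize_names: true)
  registry.fetch(fn["name"]).call(**args)
end
# results => ["Boston, MA: 21 degrees celsius"]
```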
It's a full round-trip:
Tool spec → GPT picks it → You call it → Return result.
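To close that loop in a multi-turn conversation, the OpenAI API expects the tool’s output back as a `role: "tool"` message (carrying the `tool_call_id` from the first response) before a second completion call. A minimal sketch, assuming your lambda’s return value is in hand; the helper name here is hypothetical, not part of the gem:

```ruby
# Hypothetical helper: appends the tool's output to the message history
# so a second chat.completions.create call can read it.
def tool_result_messages(history, tool_call_id:, result:)
  history + [
    {
      role: "tool",
      tool_call_id: tool_call_id,
      content: result.to_s # OpenAI expects tool output as a string
    }
  ]
end

messages = [{ role: "user", content: "What's the weather like in Boston?" }]
follow_up = tool_result_messages(
  messages,
  tool_call_id: "call_abc123",           # from the first response's tool_calls
  result: "72F and sunny in Boston, MA"  # value returned by your lambda
)
# follow_up now ends with a role: "tool" message the model can read
```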
Features
- Define tools inline with your Ruby logic
- Fully OpenAI-compatible tool schemas
- Works with any GPT-4 / GPT-3.5 model that supports tool_calls
- Clean, declarative DSL
- Supports multiple tools per class
Installation
Add it to your Gemfile:
```ruby
gem "openai-toolable"
```
Or install manually:
```shell
gem install openai-toolable
```
Full Docs & Usage
All source code, examples, and docs here: https://github.com/vancuren/openai-toolable
Why Use This?
With OpenAI’s tool_calls, the future is function-aware models.
This gem lets you build GPT agents using just Ruby, without the mess of wiring schemas and handlers manually.
Instead of this:
```json
{
  "type": "function",
  "function": {
    "name": "calculate_something",
    "parameters": ...
  }
}
```
You just write a method and mark it tool(...).
Simple, readable, and ready to go.
Future Ideas
- Tool chaining for multi-step calls
- Streaming + parallel calls
- Built-in error handling + retry
- Optional Rails generators for service objects
Open to feature suggestions! Drop them in Issues or send a PR.
Help Spread the Word
If you find this useful:
- ⭐ Star the repo
- Share it on Twitter/X
- Mention it in r/ruby or other dev communities
- Link it in your AI projects or Rails agents
Let’s make Ruby a first-class citizen in the AI tooling world.
Stay in the Loop
I'm building open-source AI infrastructure and Ruby dev tools.
Follow me on GitHub for updates.