mcphero: Use MCP servers with native OpenAI clients
I released a small Python package that maps MCP servers to native OpenAI clients.
PyPI: https://pypi.org/project/mcphero/
Github: https://github.com/stepacool/mcphero
The problem
MCP servers are cool, but:
- Native openai/gemini clients don't support MCP
- Because of that, many projects don't use MCP at all
- OR projects have to write a lot of custom mapping code
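To make the "custom mapping code" point concrete, here is a minimal sketch of the glue you would otherwise write by hand: converting one MCP tool definition into the OpenAI function-calling schema. The `mcp_tool_to_openai` helper and the `weather_tool` sample are hypothetical, for illustration only; the field names follow the MCP tool shape (name/description/inputSchema) and the OpenAI tools format.

```python
def mcp_tool_to_openai(mcp_tool: dict) -> dict:
    """Map a single MCP tool definition to an OpenAI 'tools' entry (sketch)."""
    return {
        "type": "function",
        "function": {
            "name": mcp_tool["name"],
            "description": mcp_tool.get("description", ""),
            # MCP exposes a JSON Schema under "inputSchema"; OpenAI expects
            # the same schema under "parameters".
            "parameters": mcp_tool.get(
                "inputSchema", {"type": "object", "properties": {}}
            ),
        },
    }

# Made-up MCP tool, just to show the mapping:
weather_tool = {
    "name": "get_weather",
    "description": "Get current weather for a city",
    "inputSchema": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}

openai_tool = mcp_tool_to_openai(weather_tool)
```

And that only covers one direction; you still need the reverse path that routes the model's tool calls back to the MCP server.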
What mcphero does
- Converts MCP tools into OpenAI-compatible tools/functions
- Sends LLM tool-call results back to the MCP server for execution
- Returns updated message history
- Two-way mapping in ~2 lines of code
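To unpack "returns updated message history": after a tool call is executed, its result is appended as a "tool" role message so the next completion call can see it. The shapes below follow the OpenAI chat format; the ids and payload are illustrative, not mcphero's actual output.

```python
messages = [
    {"role": "user", "content": "What's the weather in London?"},
    {  # the assistant's tool call, as returned by the model
        "role": "assistant",
        "tool_calls": [{
            "id": "call_123",
            "type": "function",
            "function": {"name": "get_weather", "arguments": '{"city": "London"}'},
        }],
    },
]

# Executing the call against the MCP server yields a result, which is
# appended with the matching tool_call_id:
messages.append({
    "role": "tool",
    "tool_call_id": "call_123",
    "content": '{"temp_c": 12, "conditions": "cloudy"}',
})
```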
Minimal example
import asyncio

from openai import OpenAI
from mcphero import MCPToolAdapterOpenAI  # import path assumed from the package name

async def main():
    adapter = MCPToolAdapterOpenAI("https://api.mcphero.app/mcp/your-server-id/mcp")
    client = OpenAI()

    # MCP tools -> OpenAI-compatible tool definitions
    tools = await adapter.get_tool_definitions()

    response = client.chat.completions.create(
        model="gpt-5",
        messages=[{"role": "user", "content": "What's the weather in London?"}],
        tools=tools,
    )

    # OpenAI tool calls -> executed on the MCP server
    message = response.choices[0].message
    result = await adapter.process_tool_calls(message)

asyncio.run(main())
Without get_tool_definitions(), MCP tools can't be wired into the OpenAI tools argument. Without process_tool_calls(), the model's tool_calls can't be routed back to the MCP server for execution.
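Those two calls are enough to build the usual agent loop: call the model, execute any tool calls, append the results, repeat until the model answers in plain text. A sketch of that loop, with `complete` and `execute_tools` standing in for `client.chat.completions.create` and `adapter.process_tool_calls` (the stubs below are hypothetical, only there to show the control flow):

```python
def tool_loop(complete, execute_tools, messages, tools, max_rounds=5):
    """Call the model, execute tool calls, repeat until a plain answer."""
    for _ in range(max_rounds):
        message = complete(messages, tools)
        if not message.get("tool_calls"):
            return message  # no tool calls: the model gave a final answer
        messages.append(message)
        # execute_tools runs each call on the MCP server and returns the
        # "tool" role result messages to append to the history
        messages.extend(execute_tools(message["tool_calls"]))
    return message

# Stub model: asks for one tool call on the first round, then answers.
state = {"round": 0}

def fake_complete(messages, tools):
    state["round"] += 1
    if state["round"] == 1:
        return {"role": "assistant", "tool_calls": [
            {"id": "c1", "function": {"name": "get_weather", "arguments": "{}"}}]}
    return {"role": "assistant", "content": "12C and cloudy in London."}

def fake_execute(tool_calls):
    return [{"role": "tool", "tool_call_id": c["id"], "content": "12C cloudy"}
            for c in tool_calls]

final = tool_loop(fake_complete, fake_execute,
                  [{"role": "user", "content": "Weather in London?"}], tools=[])
```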
Feedback is welcome.
If you are looking for a Lovable/Vercel for MCP servers, try mcphero.app: managed, AI-generated MCP servers built for devs.