# Using FastMCP with OpenAI: Avoiding Session Termination Issues
When integrating FastMCP as a Model Context Protocol (MCP) server for OpenAI's Responses API, you may run into a frustrating issue: OpenAI sends a DELETE request to end the MCP session after each call, without automatically creating a new session for subsequent requests. As a result, FastMCP responds with:
```json
{
  "jsonrpc": "2.0",
  "id": "server-error",
  "error": {
    "code": -32600,
    "message": "Not Found: Session has been terminated"
  }
}
```
## The Fix: `stateless_http=True`
When instantiating your FastMCP server, set:
```python
from fastmcp import FastMCP

mcp = FastMCP(
    name="MyToolServer",
    stateless_http=True,  # crucial for streaming stability
)

@mcp.tool()
def greet(name: str) -> str:
    return f"Hello, {name}!"

if __name__ == "__main__":
    mcp.run(
        transport="http",
        host="0.0.0.0",
        port=8000,
    )
```
## Why This Works
Setting `stateless_http=True` disables stateful session tracking. Every request is handled independently, so OpenAI's DELETE calls don't break the connection. Without it, streaming sessions are terminated prematurely.
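To make the mechanism concrete, here is a toy model of the session check; this is not FastMCP's actual implementation, just an illustration of why a terminated session ID only matters in stateful mode:

```python
# Toy model (not FastMCP's real code): a stateful server rejects requests
# bearing a terminated session ID, while a stateless server has no session
# table to consult, so the same request succeeds.
class ToyServer:
    def __init__(self, stateless_http: bool = False):
        self.stateless_http = stateless_http
        self.terminated: set[str] = set()

    def delete_session(self, session_id: str) -> None:
        # What OpenAI's DELETE request does after each call.
        self.terminated.add(session_id)

    def handle(self, session_id: str, payload: dict) -> dict:
        if not self.stateless_http and session_id in self.terminated:
            return {
                "jsonrpc": "2.0",
                "id": "server-error",
                "error": {
                    "code": -32600,
                    "message": "Not Found: Session has been terminated",
                },
            }
        return {"jsonrpc": "2.0", "id": payload.get("id"), "result": "ok"}

stateful = ToyServer()
stateful.delete_session("abc")
print(stateful.handle("abc", {"id": 1})["error"]["message"])
# → Not Found: Session has been terminated

stateless = ToyServer(stateless_http=True)
stateless.delete_session("abc")
print(stateless.handle("abc", {"id": 1})["result"])
# → ok
```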
More details: OpenAI Forum Thread
## Caveats
- Session IDs won’t persist between requests, so advanced workflows relying on session state won’t work.
- Sampling and certain bi-directional streaming patterns may not be supported in stateless mode.
## TL;DR
If your OpenAI + FastMCP streaming requests fail after one call, set `stateless_http=True` when starting the server. It's the simplest fix to keep your MCP server ready for every new streaming request.