Andreas Bergström

Using FastMCP with OpenAI: Avoiding Session Termination Issues

When integrating FastMCP as a Model Context Protocol (MCP) server for OpenAI’s Responses API, you may encounter a frustrating issue:

OpenAI sends DELETE requests to end the MCP session after each call, without automatically creating a new session for subsequent requests. Those follow-up requests then reference a terminated session, and FastMCP responds with:

{
    "jsonrpc": "2.0",
    "id": "server-error",
    "error": {
        "code": -32600,
        "message": "Not Found: Session has been terminated"
    }
}

The Fix: stateless_http=True

When instantiating your FastMCP server, set:

from fastmcp import FastMCP

mcp = FastMCP(
    name="MyToolServer",
    stateless_http=True  # crucial for streaming stability
)

@mcp.tool()
def greet(name: str) -> str:
    return f"Hello, {name}!"

if __name__ == "__main__":
    mcp.run(
        transport="http",
        host="0.0.0.0",
        port=8000
    )
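
Once the server is running (and reachable from the public internet), it can be attached to the Responses API as a hosted MCP tool. The sketch below is only illustrative: the model name, server_label, and server URL are placeholders you would swap for your own.

from openai import OpenAI

client = OpenAI()

response = client.responses.create(
    model="gpt-4.1",  # placeholder; any Responses API model works
    tools=[{
        "type": "mcp",
        "server_label": "my_tool_server",         # placeholder label
        "server_url": "https://example.com/mcp",  # placeholder URL for the server above
        "require_approval": "never",
    }],
    input="Use the greet tool to say hello to Andreas.",
)

print(response.output_text)

With stateless_http=True on the server, repeated calls like this keep working even after OpenAI tears down the previous session.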

Why This Works

stateless_http=True disables stateful session tracking: every request is handled independently, so OpenAI's DELETE calls don't invalidate subsequent requests. Without it, streaming sessions are terminated prematurely.
More details: OpenAI Forum Thread
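
To see the behaviour locally, a quick smoke test with FastMCP's bundled client is enough. This is a minimal sketch assuming the server above is running on localhost:8000 and exposes FastMCP's default /mcp HTTP path; each async with block opens its own connection, mirroring the fact that no state is carried between requests.

import asyncio
from fastmcp import Client

async def main():
    # Two completely independent connections; in stateless mode the server
    # keeps no session state between them.
    for name in ("Ada", "Grace"):
        async with Client("http://localhost:8000/mcp") as client:
            result = await client.call_tool("greet", {"name": name})
            print(result)

asyncio.run(main())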

Caveats

  • Session IDs won’t persist between requests, so advanced workflows relying on session state won’t work (a possible work-around is sketched after this list).
  • Sampling and certain bi-directional streaming patterns may not be supported in stateless mode.
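
If you do need to carry context across calls, one hypothetical work-around is to key state on a tool argument instead of the MCP session. The conversation_id parameter and in-memory dict below are purely illustrative, not a FastMCP feature:

from fastmcp import FastMCP

mcp = FastMCP(name="MyToolServer", stateless_http=True)

# Hypothetical application-level state, keyed by a caller-supplied id,
# since MCP session ids don't persist in stateless mode.
_notes: dict[str, list[str]] = {}

@mcp.tool()
def remember(conversation_id: str, note: str) -> str:
    _notes.setdefault(conversation_id, []).append(note)
    return f"Stored {len(_notes[conversation_id])} note(s) for {conversation_id}."

@mcp.tool()
def recall(conversation_id: str) -> list[str]:
    return _notes.get(conversation_id, [])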

TL;DR

If your OpenAI + FastMCP streaming requests fail after the first call, set stateless_http=True when creating the FastMCP server. It’s the simplest fix to keep your MCP server ready for every new streaming request.
