The OpenAPI to Model Context Protocol (MCP) Proxy is a powerful tool designed to bridge the gap between AI agents and external APIs. By dynamically translating OpenAPI specifications into standardized MCP tools, resources, and prompts, it eliminates the need for custom API wrappers, making integration seamless and efficient.
The proxy communicates over `stdio`, working out-of-the-box with popular LLM orchestrators.

To install it (the client configuration below expects a virtual environment named `venv`):

```bash
git clone https://github.com/gujord/OpenAPI-MCP.git
cd OpenAPI-MCP
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt
```
For Claude Desktop, Cursor, and Windsurf, use the snippet below and adapt the paths accordingly:
```json
{
  "mcpServers": {
    "petstore3": {
      "command": "full_path_to_openapi_mcp/venv/bin/python",
      "args": ["full_path_to_openapi_mcp/src/server.py"],
      "env": {
        "SERVER_NAME": "petstore3",
        "OPENAPI_URL": "https://petstore3.swagger.io/api/v3/openapi.json"
      },
      "transport": "stdio"
    }
  }
}
```
Apply this configuration to the following files:

- `~/.cursor/mcp.json`
- `~/.codeium/windsurf/mcp_config.json`
- `~/Library/Application Support/Claude/claude_desktop_config.json`

Replace `full_path_to_openapi_mcp` with your actual installation path.
| Variable | Description | Required | Default |
|---|---|---|---|
| `OPENAPI_URL` | URL to the OpenAPI specification | Yes | - |
| `SERVER_NAME` | MCP server name | No | `openapi_proxy_server` |
| `OAUTH_CLIENT_ID` | OAuth client ID | No | - |
| `OAUTH_CLIENT_SECRET` | OAuth client secret | No | - |
| `OAUTH_TOKEN_URL` | OAuth token endpoint URL | No | - |
| `OAUTH_SCOPE` | OAuth scope | No | `api` |
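A minimal sketch of how a server might consume these variables is shown below. The dict layout and defaults are illustrative, taken from the table above; this is not the proxy's actual internal structure.

```python
import os

# Demo value so the snippet runs standalone; OPENAPI_URL is required in practice.
os.environ.setdefault("OPENAPI_URL", "https://petstore3.swagger.io/api/v3/openapi.json")

# Defaults mirror the configuration table; optional OAuth settings fall back to None.
config = {
    "openapi_url": os.environ["OPENAPI_URL"],
    "server_name": os.environ.get("SERVER_NAME", "openapi_proxy_server"),
    "oauth_client_id": os.environ.get("OAUTH_CLIENT_ID"),
    "oauth_client_secret": os.environ.get("OAUTH_CLIENT_SECRET"),
    "oauth_token_url": os.environ.get("OAUTH_TOKEN_URL"),
    "oauth_scope": os.environ.get("OAUTH_SCOPE", "api"),
}
```

With no OAuth variables set, the proxy would skip the token-request branch shown in the sequence diagram below.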
Install `httpx` and `PyYAML` manually if needed.

The sequence diagram below shows how the proxy mediates between an LLM and an external API:

```mermaid
sequenceDiagram
    participant LLM as LLM (Claude/GPT)
    participant MCP as OpenAPI-MCP Proxy
    participant API as External API
    Note over LLM, API: Communication Process
    LLM->>MCP: 1. Initialize (initialize)
    MCP-->>LLM: Metadata, tools, resources, and prompts
    LLM->>MCP: 2. Request tools (tools_list)
    MCP-->>LLM: Detailed list of tools, resources, and prompts
    LLM->>MCP: 3. Call tool (tools_call)
    alt With OAuth2
        MCP->>API: Request OAuth2 token
        API-->>MCP: Access Token
    end
    MCP->>API: 4. Execute API call with proper formatting
    API-->>MCP: 5. API response (JSON)
    alt Type Conversion
        MCP->>MCP: 6. Convert parameters to correct data types
    end
    MCP-->>LLM: 7. Formatted response from API
    alt Dry Run Mode
        LLM->>MCP: Call with dry_run=true
        MCP-->>LLM: Display request information without executing call
    end
```
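Step 3 of the diagram (`tools_call`) and the dry-run mode map onto a JSON-RPC request roughly like the one below. The tool name `getPetById` and its `petId` argument are illustrative guesses based on the Petstore spec, not names guaranteed by the proxy:

```json
{
  "jsonrpc": "2.0",
  "id": 3,
  "method": "tools_call",
  "params": {
    "name": "getPetById",
    "arguments": { "petId": 1, "dry_run": true }
  }
}
```

With `"dry_run": true`, the proxy would return the request it *would* have sent instead of calling the external API.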
In addition to tools, the proxy server automatically registers resources (accessible via `/resource/{name}`) for structured data handling.

If you find this project useful, please give it a ⭐ on GitHub!
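Step 6 of the sequence diagram (parameter type conversion) can be sketched as follows; `convert_parameter` is a hypothetical helper, not the proxy's actual function, and the real implementation likely covers more schema types (arrays, objects, nulls):

```python
def convert_parameter(value, schema_type):
    """Coerce an incoming string argument to its declared OpenAPI schema type.

    Illustrative sketch only: LLMs often pass every argument as a string,
    so a proxy must convert them back before calling the API.
    """
    converters = {
        "integer": int,
        "number": float,
        "boolean": lambda v: str(v).strip().lower() in ("true", "1", "yes"),
        "string": str,
    }
    # Fall back to passing the value through as a string for unknown types.
    return converters.get(schema_type, str)(value)

print(convert_parameter("42", "integer"))  # → 42
```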