LMStudio-MCP creates a bridge between Claude (with MCP capabilities) and your locally running LM Studio instance, allowing Claude to query and control your local models. This enables you to leverage your own locally running models through Claude's interface, combining Claude's capabilities with your private models.
Clone this repository:

```shell
git clone https://github.com/infinitimeless/LMStudio-MCP.git
cd LMStudio-MCP
```
Install the required packages:

```shell
pip install requests "mcp[cli]" openai
```
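Before wiring the bridge into Claude, it can help to confirm that LM Studio's local server is reachable. The following sketch uses only the standard library and assumes LM Studio's default server address of `localhost:1234`; adjust the URL if you changed the port in LM Studio's server settings.

```python
import json
import urllib.error
import urllib.request

# LM Studio's default local server address (an assumption; change if customized)
LMSTUDIO_URL = "http://localhost:1234/v1"

def lmstudio_reachable(base_url: str = LMSTUDIO_URL) -> bool:
    """Return True if the LM Studio API answers on its /models endpoint."""
    try:
        with urllib.request.urlopen(f"{base_url}/models", timeout=5) as resp:
            # The OpenAI-compatible API returns a JSON object with a "data" list
            return "data" in json.load(resp)
    except (urllib.error.URLError, TimeoutError, json.JSONDecodeError):
        return False

if __name__ == "__main__":
    print("LM Studio reachable:", lmstudio_reachable())
```

If this prints `False`, start LM Studio's local server (and load a model) before continuing.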
For Claude to connect to this bridge, you need to configure the MCP settings properly. You can either:
Use directly from GitHub:
```json
{
  "lmstudio-mcp": {
    "command": "uvx",
    "args": [
      "https://github.com/infinitimeless/LMStudio-MCP"
    ]
  }
}
```
Use local installation:
json
{
"lmstudio-mcp": {
"command": "/bin/bash",
"args": [\
"-c",\
"cd /path/to/LMStudio-MCP && source venv/bin/activate && python lmstudio_bridge.py"\
]
}
}
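In Claude Desktop, either entry belongs under the `mcpServers` key of the `claude_desktop_config.json` file. A sketch of the full file using the GitHub variant (the surrounding `mcpServers` wrapper is standard Claude Desktop configuration, not specific to this project):

```json
{
  "mcpServers": {
    "lmstudio-mcp": {
      "command": "uvx",
      "args": [
        "https://github.com/infinitimeless/LMStudio-MCP"
      ]
    }
  }
}
```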
For detailed MCP configuration instructions, see MCP_CONFIGURATION.md.
If running locally (not using `uvx`), run the LMStudio-MCP server:

```shell
python lmstudio_bridge.py
```
In Claude, connect to the MCP server when prompted by selecting "lmstudio-mcp".
The bridge provides the following functions:

- `health_check()`: Verify if the LM Studio API is accessible
- `list_models()`: Get a list of all available models in LM Studio
- `get_current_model()`: Identify which model is currently loaded
- `chat_completion(prompt, system_prompt, temperature, max_tokens)`: Generate text from your local model

If Claude reports 404 errors when trying to connect to LM Studio, verify that LM Studio is running, its local server is started, and a model is loaded. If certain models don't work correctly, they may not fully support the OpenAI-compatible chat completions API.
For more detailed troubleshooting help, see TROUBLESHOOTING.md.
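The `chat_completion()` function listed above ultimately forwards requests to LM Studio's OpenAI-compatible `/v1/chat/completions` endpoint. A minimal standard-library sketch of an equivalent raw request, assuming the default server address `localhost:1234` (the parameter names mirror the bridge's tool signature; the default values here are illustrative, not the bridge's actual defaults):

```python
import json
import urllib.error
import urllib.request

# Assumed default LM Studio server address; change if customized
LMSTUDIO_URL = "http://localhost:1234/v1"

def chat_completion(prompt: str,
                    system_prompt: str = "You are a helpful assistant.",
                    temperature: float = 0.7,
                    max_tokens: int = 1024,
                    base_url: str = LMSTUDIO_URL) -> str:
    """Generate text from the currently loaded LM Studio model."""
    payload = {
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": prompt},
        ],
        "temperature": temperature,
        "max_tokens": max_tokens,
    }
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req, timeout=60) as resp:
            body = json.load(resp)
            # OpenAI-compatible response shape: choices[0].message.content
            return body["choices"][0]["message"]["content"]
    except (urllib.error.URLError, TimeoutError) as exc:
        return f"Error contacting LM Studio: {exc}"
```

This is a sketch under the stated assumptions, not the bridge's actual implementation; see `lmstudio_bridge.py` for the real code.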
MIT
This project was originally developed as "Claude-LMStudio-Bridge_V2" and has been renamed and open-sourced as "LMStudio-MCP".