A Model Context Protocol (MCP) client for language models served through Ollama.
## Installation

```bash
# Clone the repository
git clone <repository-url>
cd <repository-directory>

# Install the package
pip install -e .
```
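After installation, a quick sanity check can confirm the entry point is on your PATH. This is a sketch: `--help` is assumed here (it is standard for Python CLIs) and is not documented above.

```bash
# Verify the mcp-ollama command is installed and list its options
mcp-ollama --help
```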
## Usage

```bash
# Basic usage
mcp-ollama --model llama3 --server http://localhost:8000 "List all tables in my database"

# Specify a different Ollama server
mcp-ollama --model qwen --ollama-api http://localhost:11434 --server http://localhost:8000 "Show me the schema for the users table"
```
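Streaming can also be toggled per run with the `--stream` option described under Configuration below. A sketch, assuming the flag takes a true/false value as its documented default suggests:

```bash
# Disable streaming for a single invocation (the exact flag syntax is an assumption)
mcp-ollama --model llama3 --server http://localhost:8000 --stream false "Count the rows in the orders table"
```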
## Configuration

The client can be configured using command-line arguments or a configuration file:
- `--model`: The Ollama model to use (e.g., llama3, qwen, mistral, gemma)
- `--server`: URL of the Database MCP server
- `--ollama-api`: URL of the Ollama API server (default: `http://localhost:11434`)
- `--stream`: Enable/disable streaming responses (default: `true`)
- `--config`: Path to a configuration file

Alternatively, create a `config.json` file in your home directory:
```json
{
  "model": "llama3",
  "server": "http://localhost:8000",
  "ollama_api": "http://localhost:11434",
  "stream": true
}
```
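With the config file in place, queries can be run without repeating the connection flags. The second command below assumes (this is not confirmed above) that an explicit command-line flag overrides the corresponding config value for that run:

```bash
# Model, server URLs, and streaming settings come from ~/config.json
mcp-ollama "List all tables in my database"

# Assumed behavior: the flag overrides the config file's "model" for this run
mcp-ollama --model mistral "List all tables in my database"
```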
## Examples

```bash
# List tables
mcp-ollama "What tables do I have in my database?"

# Query data
mcp-ollama "Show me all users who registered in the last month"

# Create a table
mcp-ollama "Create a new table called products with columns for id, name, price, and description"
```
## License

MIT