Ollama to MCP Server Bridge

A bridge connecting Model Context Protocol (MCP) servers to OpenAI-compatible LLMs like Ollama.

Overview

The Ollama to MCP Server Bridge is a simple tool that connects Model Context Protocol (MCP) servers with OpenAI-compatible Large Language Models (LLMs) such as Ollama. The bridge relays messages between the MCP server and the LLM, so the model can use tools exposed by the server.

Quick Start

To get started with the Ollama to MCP Server Bridge, follow these steps:

# Install
curl -LsSf https://astral.sh/uv/install.sh | sh
git clone https://github.com/bartolli/mcp-llm-bridge.git
cd mcp-llm-bridge
uv venv
source .venv/bin/activate
uv pip install -e .

Note: If you open a new shell, reactivate the environment before using the keys in .env: source .venv/bin/activate
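The commented-out OpenAI configuration shown below reads `OPENAI_API_KEY` and `OPENAI_MODEL` from the environment, so a minimal `.env` might look like this (the values are placeholders, not real credentials):

```
OPENAI_API_KEY=sk-your-key-here
OPENAI_MODEL=gpt-4o
```

When pointing the bridge at a local Ollama endpoint instead, these keys are not required.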

Then configure the bridge in src/mcp_llm_bridge/main.py:

mcp_server_params=StdioServerParameters(
    command="uv",
    # CHANGE THIS: the directory must be an absolute path. Clone the MCP fetch
    # server from https://github.com/modelcontextprotocol/servers/ and point
    # --directory at its src/fetch folder.
    args=["--directory", "~/llms/mcp/mc-server-fetch/servers/src/fetch", "run", "mcp-server-fetch"],
    env=None
),
# llm_config=LLMConfig(
#     api_key=os.getenv("OPENAI_API_KEY"),
#     model=os.getenv("OPENAI_MODEL", "gpt-4o"),
#     base_url=None
# ),
llm_config=LLMConfig(
    api_key="ollama",  # Can be any string for local testing
    model="llama3.2",
    base_url="http://localhost:11434/v1"  # Point to your local model's endpoint
),
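Because the `--directory` argument must be an absolute path, and the sample above uses `~` (which is not expanded by `uv` arguments the way a shell would expand it), one way to resolve it is with `os.path.expanduser` before building the config. This is a small sketch, not part of the project's own code:

```python
import os

# "~" in the sample path must be resolved to an absolute directory
# before it is passed to StdioServerParameters.
fetch_dir = os.path.expanduser("~/llms/mcp/mc-server-fetch/servers/src/fetch")
print(fetch_dir)
```

The resulting string can then replace the literal path in the `args` list.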

Additional Endpoint Support

The bridge also works with any endpoint implementing the OpenAI API specification:

Ollama

llm_config=LLMConfig(
    api_key="not-needed",
    model="mistral-nemo:12b-instruct-2407-q8_0",
    base_url="http://localhost:11434/v1"
)
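Any endpoint following the OpenAI API specification accepts the standard chat-completions request body. As a rough sketch, this is the shape of the JSON the bridge would POST to `{base_url}/chat/completions` (the URL and model name are taken from the Ollama config above; the helper function is illustrative, not part of the bridge):

```python
import json

OLLAMA_BASE_URL = "http://localhost:11434/v1"  # local Ollama endpoint

def build_chat_request(model, messages):
    """Build a minimal OpenAI-style chat-completions payload."""
    return {"model": model, "messages": messages}

payload = build_chat_request(
    "llama3.2",
    [{"role": "user", "content": "Fetch https://example.com and summarize it."}],
)
# POST this JSON to OLLAMA_BASE_URL + "/chat/completions"; Ollama does not
# validate the API key, so any non-empty string works in the Authorization header.
print(json.dumps(payload, indent=2))
```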

License

This project is licensed under the MIT License.

Contributing

Contributions are welcome: feel free to submit a pull request.

About

A simple bridge from Ollama to a fetch-URL MCP server.
