MCPOmni Connect is a powerful, universal command-line interface (CLI) that serves as your gateway to the Model Context Protocol (MCP) ecosystem. It seamlessly integrates multiple MCP servers, AI models, and various transport protocols into a unified, intelligent interface.
MCPOmni Connect
├── Transport Layer
│   ├── Stdio Transport
│   ├── SSE Transport
│   └── Docker Integration
├── Session Management
│   ├── Multi-Server Orchestration
│   └── Connection Lifecycle Management
├── Tool Management
│   ├── Dynamic Tool Discovery
│   ├── Cross-Server Tool Routing
│   └── Tool Execution Engine
└── AI Integration
    ├── LLM Processing
    ├── Context Management
    └── Response Generation
```shell
# With uv (recommended)
uv add mcpomni-connect

# Using pip
pip install mcpomni-connect

# Start the CLI (make sure your API key is exported or set in a .env file)
mcpomni_connect
```
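For example, either of these makes the key available before launch (the `LLM_API_KEY` variable name comes from the Configuration section below):

```shell
# Option 1: export the key for the current shell session
export LLM_API_KEY=your_key_here

# Option 2: persist it in a .env file in the working directory
echo "LLM_API_KEY=your_key_here" > .env
```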
```shell
# Run all tests with verbose output
pytest tests/ -v

# Run a specific test file
pytest tests/test_specific_file.py -v

# Run tests with a coverage report
pytest tests/ --cov=src --cov-report=term-missing
```
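pytest can also select a single test function with its `::` syntax, which is handy during development (the file and test names below are placeholders):

```shell
# Run one test function from one file (names are illustrative)
pytest tests/unit/test_transport.py::test_stdio_connect -v
```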
tests/
└── unit/    # Unit tests for individual components
Installation
```shell
# Clone the repository
git clone https://github.com/Abiorh001/mcp_omni_connect.git
cd mcp_omni_connect

# Create and activate a virtual environment
uv venv
source .venv/bin/activate

# Install dependencies
uv sync
```
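After syncing, you can confirm the development environment is working by running the test suite shown above:

```shell
# Run the tests inside the project's environment
uv run pytest tests/ -v
```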
Configuration
```shell
# Set up environment variables
echo "LLM_API_KEY=your_key_here" > .env
```
Start Client
```shell
# Start the client
uv run src/main.py
# or
python src/main.py
```
```json
{
    "LLM": {
        "provider": "openai",    // Supports: "openai", "openrouter", "groq"
        "model": "gpt-4",        // Any model from supported providers
        "temperature": 0.5,
        "max_tokens": 5000,
        "top_p": 0
    },
    "mcpServers": {
        "filesystem-server": {
            "command": "npx",
            "args": [
                "@modelcontextprotocol/server-filesystem",
                "/path/to/files"
            ]
        },
        "sse-server": {
            "type": "sse",
            "url": "http://localhost:3000/mcp",
            "headers": {
                "Authorization": "Bearer token"
            }
        },
        "docker-server": {
            "command": "docker",
            "args": ["run", "-i", "--rm", "mcp/server"]
        }
    }
}
```
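Note that the `//` comments above are illustrative only; strict JSON does not allow them, so remove them from your real config. You can sanity-check the file with Python's built-in validator (the filename `servers_config.json` is an assumption; use whatever name your setup expects):

```shell
# Validate and pretty-print the config file (filename is an assumption)
python -m json.tool servers_config.json
```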
- `/tools` - List all available tools across servers
- `/prompts` - View available prompts
- `/prompt:<name>/<args>` - Execute a prompt with arguments

```shell
# Example: Weather prompt
/prompt:weather/location=tokyo/units=metric
/prompt:weather/{"location":"tokyo","units":"metric"}
```

- `/resources` - List available resources
- `/resource:<uri>` - Access and analyze a resource
- `/debug` - Toggle debug mode
- `/refresh` - Update server capabilities
```shell
# List all available prompts
/prompts

# Basic prompt usage
/prompt:weather/location=tokyo

# Multiple arguments (required arguments depend on the server's prompt definition)
/prompt:travel-planner/from=london/to=paris/date=2024-03-25

# JSON format for complex arguments
/prompt:analyze-data/{
    "dataset": "sales_2024",
    "metrics": ["revenue", "growth"],
    "filters": {
        "region": "europe",
        "period": "q1"
    }
}

# Nested argument structures
/prompt:market-research/target=smartphones/criteria={
    "price_range": {"min": 500, "max": 1000},
    "features": ["5G", "wireless-charging"],
    "markets": ["US", "EU", "Asia"]
}
```
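Resource commands follow the same interactive pattern; for example (the URI is illustrative and depends on what the connected server exposes):

```shell
# List resources exposed by all connected servers
/resources

# Access and analyze a specific resource
/resource:file:///path/to/document.pdf
```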
The client intelligently:
- Chains multiple tools together
- Provides context-aware responses
- Automatically selects appropriate tools
- Handles errors gracefully
- Maintains conversation context
```shell
# Example of automatic tool chaining (when the required tools are available on connected servers)
User: "Find charging stations near Silicon Valley and check their current status"

# The client automatically:
1. Uses the Google Maps API to locate Silicon Valley
2. Searches for charging stations in the area
3. Checks station status through the EV network API
4. Formats and presents the results
```
```shell
# Automatic resource processing
User: "Analyze the contents of /path/to/document.pdf"

# The client automatically:
1. Identifies the resource type
2. Extracts the content
3. Processes it through the LLM
4. Provides an intelligent summary
```
We welcome contributions! See our Contributing Guide for details.
This project is licensed under the MIT License - see the LICENSE file for details.
Built with ❤️ by the MCPOmni Connect Team