conechoai_nchan_mcp_transport

by ConechoAI

A high-performance WebSocket/SSE transport layer and gateway for Anthropic's Model Context Protocol (MCP).

Nchan MCP Transport for Claude

Overview

Nchan MCP Transport is a high-performance WebSocket/SSE transport layer and gateway for Anthropic's MCP (Model Context Protocol). It is powered by Nginx, Nchan, and FastAPI, making it ideal for building real-time, scalable AI integrations with Claude and other LLM agents.

Key Features

  • 🔄 Dual Protocol Support: seamlessly supports WebSocket and SSE with automatic detection
  • 🚀 High-Performance Pub/Sub: built on Nginx + Nchan; handles thousands of concurrent connections
  • 🔌 MCP-Compliant Transport: fully implements the Model Context Protocol (JSON-RPC 2.0)
  • 🧰 OpenAPI Integration: auto-generates MCP tools from any OpenAPI spec
  • 🪝 Tool / Resource System: register tools and resources with Python decorators
  • 📡 Asynchronous Execution: background task queue with live progress updates via push notifications
  • 🧱 Dockerized Deployment: spin up easily with Docker Compose
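
The asynchronous-execution model above (a background task streaming progress to the client over the pub/sub channel) can be sketched with an asyncio queue standing in for an Nchan channel. This is an illustrative sketch, not the project's implementation:

```python
import asyncio

async def long_task(channel: asyncio.Queue):
    """Background job publishing progress events, as Nchan would push them."""
    for pct in (25, 50, 75, 100):
        await asyncio.sleep(0)        # simulate a unit of work
        await channel.put({"progress": pct})
    await channel.put(None)           # sentinel: task finished

async def main():
    channel = asyncio.Queue()         # stands in for an Nchan pub/sub channel
    asyncio.create_task(long_task(channel))
    events = []
    while (event := await channel.get()) is not None:
        events.append(event)          # a subscribed client would render these
    return events

events = asyncio.run(main())
print(events)
```

In the real system the queue is replaced by an Nchan channel, so progress events survive client reconnects and fan out to every subscriber.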

Why Use This?

MCP lets AI assistants like Claude communicate with external tools. However, the native MCP transport is HTTP+SSE, which struggles with long-running tasks, network instability, and high concurrency. Nchan MCP Transport bridges the gap by providing:

  • Web-scale performance (Nginx/Nchan)
  • FastAPI-powered backend for tools
  • Real-time event delivery to Claude clients
  • Plug-and-play OpenAPI to Claude integration
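
Because the transport is MCP-compliant, every message on the wire is a standard JSON-RPC 2.0 envelope. A minimal sketch of a `tools/call` request and the response the gateway pushes back (field names follow the MCP schema; the tool name is illustrative):

```python
import json

# A typical MCP tools/call request as it travels over WebSocket or SSE.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "search_docs", "arguments": {"query": "nchan"}},
}

# The matching response, pushed back over the same channel.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"content": [{"type": "text", "text": "Searching for nchan..."}]},
}

wire = json.dumps(request)
print(wire)
```

The `id` field is what lets the gateway correlate a pushed response with its request when many calls are in flight on one connection.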

Quickstart

1. Install Server SDK

pip install httmcp

2. Run Demo in Docker

git clone https://github.com/yourusername/nchan-mcp-transport.git
cd nchan-mcp-transport
docker-compose up -d

3. Define Your Tool

# `server` is your HTTMCP server instance (created via the SDK)
@server.tool()
async def search_docs(query: str) -> str:
    return f"Searching for {query}..."
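
Under the hood, a decorator like `server.tool()` registers the coroutine by name so the gateway can dispatch incoming `tools/call` requests to it. A stdlib-only sketch of that pattern (the class and `dispatch` method are illustrative, not the HTTMCP API):

```python
import asyncio

class ToolRegistry:
    """Minimal decorator-based tool registry, mimicking server.tool()."""

    def __init__(self):
        self._tools = {}

    def tool(self):
        def decorator(func):
            self._tools[func.__name__] = func  # register under the function name
            return func
        return decorator

    async def dispatch(self, name: str, **kwargs) -> str:
        """Route a tools/call request to the registered coroutine."""
        return await self._tools[name](**kwargs)

server = ToolRegistry()

@server.tool()
async def search_docs(query: str) -> str:
    return f"Searching for {query}..."

result = asyncio.run(server.dispatch("search_docs", query="nchan"))
print(result)  # Searching for nchan...
```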

4. Expose OpenAPI Service (Optional)

openapi_server = await OpenAPIMCP.from_openapi("https://example.com/openapi.json", publish_server="http://nchan:80")
app.include_router(openapi_server.router)
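
The OpenAPI bridge works by turning each operation in the spec into an MCP tool descriptor. A hedged sketch of that mapping using only the stdlib (the spec fragment and `openapi_to_tools` helper are illustrative; the real `OpenAPIMCP` logic will differ):

```python
import json

# A tiny OpenAPI fragment; in practice the spec is fetched from a URL.
spec = {
    "paths": {
        "/search": {
            "get": {
                "operationId": "searchDocs",
                "parameters": [
                    {"name": "q", "in": "query", "schema": {"type": "string"}}
                ],
            }
        }
    }
}

def openapi_to_tools(spec: dict) -> list[dict]:
    """Map each OpenAPI operation to an MCP-style tool descriptor."""
    tools = []
    for path, ops in spec["paths"].items():
        for _method, op in ops.items():
            props = {
                p["name"]: p.get("schema", {})
                for p in op.get("parameters", [])
            }
            tools.append({
                "name": op["operationId"],
                "inputSchema": {"type": "object", "properties": props},
            })
    return tools

tools = openapi_to_tools(spec)
print(json.dumps(tools, indent=2))
```

The `operationId` becomes the tool name Claude calls, and the operation's parameters become the tool's input schema.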

5. One-Click GPTs Actions to MCP Deployment

# Installation
pip install httmcp[cli]

# One-click deployment from GPTs Actions OpenAPI spec
python -m httmcp -f gpt_actions_openapi.json -p http://nchan:80

Use Cases

  • Claude plugin server over WebSocket/SSE
  • Real-time LLM agent backend (LangChain/AutoGen style)
  • Connect Claude to internal APIs (via OpenAPI)
  • High-performance tool/service bridge for MCP

Requirements

  • Nginx with Nchan module (pre-installed in Docker image)
  • Python 3.9+
  • Docker / Docker Compose

Tech Stack

  • 🧩 Nginx + Nchan – persistent connection management & pub/sub
  • ⚙️ FastAPI – backend logic & JSON-RPC routing
  • 🐍 HTTMCP SDK – full MCP protocol implementation
  • 🐳 Docker – deployment ready

Contributing

Pull requests are welcome! File issues if you’d like to help improve:

  • Performance
  • Deployment
  • SDK integrations

License

MIT License

