MCP Servers


All MCP Servers: a complete list of MCP server implementations, sorted by stars

alcova_ai_perplexity_mcp
2

The Perplexity MCP Server acts as a bridge between AI assistants (such as Claude Code and Claude Desktop) and the Perplexity API, providing enhanced search and reasoning capabilities. It allows AI assistants to retrieve real-time information from the web using Perplexity's Sonar Pro model and perform complex reasoning tasks with the Sonar Reasoning Pro model. This integration ensures a seamless experience for users by enabling direct access to these features within the AI assistant's interface.
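
A minimal sketch of how such a bridge might look, written here with the Python MCP SDK's FastMCP helper and Perplexity's OpenAI-compatible chat completions endpoint; the endpoint path, model name, and environment variable are assumptions for illustration, not taken from this repository.

```python
# Hypothetical sketch of a Perplexity search tool exposed over MCP.
# Assumes the official Python MCP SDK ("mcp" package) and "requests";
# the endpoint, model id, and env var are assumptions, not this repo's code.
import os
import requests
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("perplexity-search")

@mcp.tool()
def web_search(query: str) -> str:
    """Ask Perplexity's Sonar Pro model for an up-to-date answer to `query`."""
    resp = requests.post(
        "https://api.perplexity.ai/chat/completions",   # assumed endpoint
        headers={"Authorization": f"Bearer {os.environ['PERPLEXITY_API_KEY']}"},
        json={
            "model": "sonar-pro",                        # assumed model id
            "messages": [{"role": "user", "content": query}],
        },
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    mcp.run(transport="stdio")   # Claude Desktop / Claude Code connect over stdio
```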

mcp_server model_context_protocol go claude perplexity search api_integration reasoning
lupuletic_onyx_mcp_server
2

The Onyx MCP Server enables seamless integration between MCP-compatible clients and Onyx AI knowledge bases. It provides enhanced semantic search, context window retrieval, and chat integration, allowing users to search and retrieve relevant context from documents. The server supports configurable document set filtering, full document retrieval, and chat sessions for comprehensive answers.
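
The shape of the semantic-search tool can be pictured as follows. This is a Python illustration only (the project itself is TypeScript), and the Onyx endpoint path, payload fields, and environment variables are hypothetical placeholders.

```python
# Illustrative only: a semantic-search tool in the spirit of the Onyx MCP Server.
# The project is TypeScript; ONYX_API_URL, the /search path, and the JSON fields
# below are hypothetical placeholders.
import os
import requests
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("onyx-search")

@mcp.tool()
def search_onyx(query: str, document_set: str | None = None, max_results: int = 5) -> str:
    """Return the most relevant passages from the Onyx knowledge base."""
    payload = {"query": query, "limit": max_results}
    if document_set:
        payload["document_set"] = document_set       # optional document-set filter
    resp = requests.post(
        f"{os.environ['ONYX_API_URL']}/search",      # hypothetical endpoint
        headers={"Authorization": f"Bearer {os.environ['ONYX_API_KEY']}"},
        json=payload,
        timeout=30,
    )
    resp.raise_for_status()
    return "\n\n".join(hit["text"] for hit in resp.json().get("results", []))

if __name__ == "__main__":
    mcp.run(transport="stdio")
```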

mcp_server model_context_protocol typescript onyx claude search api_integration docker semantic_search chat_integration
buhe_mcp_rss by buhe
2

MCP RSS is a server implementation of the Model Context Protocol (MCP) designed to interact with RSS feeds. It allows users to parse OPML files to import RSS feed subscriptions, automatically fetch and update articles, and expose RSS content through an MCP API. Features include marking articles as favorites, filtering by source and status, and integration with MySQL for data storage. It is built using TypeScript and JavaScript, and can be configured via environment variables.
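
The OPML-import step is easy to picture; the sketch below parses an OPML file with the standard library and pulls articles with the feedparser package. The real server is TypeScript and persists articles to MySQL, which is omitted here.

```python
# Simplified picture of the OPML-import-and-fetch loop (the real server is
# TypeScript and stores articles in MySQL; storage is omitted here).
import xml.etree.ElementTree as ET
import feedparser   # third-party RSS/Atom parser

def feed_urls_from_opml(path: str) -> list[str]:
    """Extract every xmlUrl attribute from an OPML subscription file."""
    tree = ET.parse(path)
    return [
        outline.attrib["xmlUrl"]
        for outline in tree.iter("outline")
        if "xmlUrl" in outline.attrib
    ]

def fetch_articles(opml_path: str) -> list[dict]:
    articles = []
    for url in feed_urls_from_opml(opml_path):
        feed = feedparser.parse(url)
        for entry in feed.entries:
            articles.append({
                "source": feed.feed.get("title", url),
                "title": entry.get("title", ""),
                "link": entry.get("link", ""),
                "status": "unread",     # later markable as favorite or filtered by status
            })
    return articles
```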

mcp_server model_context_protocol rss typescript javascript api_integration mysql docker search opml
swatdong_mcp_sample_vsc_debug

This repository provides a sample implementation of an MCP server, designed for debugging in Visual Studio Code. It includes examples in both Python and TypeScript, allowing developers to test and debug MCP tools directly in their browser using the MCP Inspector. The project is ideal for those looking to understand and experiment with MCP server development in a controlled, debuggable environment.
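
A debuggable server of the kind the sample demonstrates can be as small as the sketch below; the server name and tool are placeholders, and the MCP Inspector is launched separately (e.g. `npx @modelcontextprotocol/inspector`).

```python
# A deliberately tiny MCP server you can set breakpoints in from VS Code.
# Names are placeholders; run it under the debugger, then point the MCP
# Inspector at it (e.g. `npx @modelcontextprotocol/inspector python server.py`).
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("debug-sample")

@mcp.tool()
def echo(text: str) -> str:
    """Return the input unchanged -- a convenient place for a breakpoint."""
    return text   # set a breakpoint here and call the tool from the Inspector

if __name__ == "__main__":
    mcp.run(transport="stdio")
```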

mcp_server model_context_protocol python typescript vscode debugging api_integration
zed_extensions_mcp_server_axiom
2

This project provides a Zed extension for the Axiom MCP server, allowing users to configure and integrate Axiom's API within Zed. It includes support for setting up API tokens and configuring the server via a `config.txt` file, enhancing Zed's functionality with Axiom's capabilities.

mcp_server model_context_protocol rust api_integration zed axiom
spences10_mcp_turso_cloud
2

This MCP server provides seamless integration between Turso databases and Large Language Models (LLMs). It features a two-level authentication system that handles both organization-level and database-level operations, enabling efficient management and querying of Turso databases directly from LLMs. The server supports various operations, including database creation, deletion, and vector search, making it a powerful tool for managing database interactions in AI workflows.
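
The two-level split can be illustrated roughly as follows: an organization-level platform token is used for administrative calls such as listing or creating databases, while a separate per-database auth token is used for queries. This is a Python illustration (the project itself is TypeScript), and the Platform API path, response fields, and environment variable names are assumptions.

```python
# Illustration of the two-level authentication idea (the project itself is
# TypeScript). The Platform API path, response fields, and env var names
# are assumptions.
import os
import requests

PLATFORM_API = "https://api.turso.tech/v1"           # assumed Platform API base

def list_databases() -> list[str]:
    """Organization-level operation: uses the org-wide platform token."""
    org = os.environ["TURSO_ORG"]
    resp = requests.get(
        f"{PLATFORM_API}/organizations/{org}/databases",
        headers={"Authorization": f"Bearer {os.environ['TURSO_PLATFORM_TOKEN']}"},
        timeout=30,
    )
    resp.raise_for_status()
    data = resp.json()
    return [db.get("Name", "") for db in data.get("databases", [])]  # field names assumed

def query_database(db_url: str, sql: str) -> None:
    """Database-level operation: would use a per-database auth token, not the
    org token. Shown only as a stub to contrast the two token scopes."""
    ...
```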

mcp_server model_context_protocol turso llms vector_search authentication typescript javascript api_integration
djbuildit_cursor_claude_think_mcp

This MCP server integrates with Cursor to activate Claude's explicit thinking mode, allowing users to see detailed reasoning processes for their queries. It uses the Model Context Protocol to intercept and format queries with special tags, triggering Claude's reasoning mode. The tool is designed for developers who want to understand Claude's thought process in problem-solving, mathematical proofs, and code analysis.

mcp_server model_context_protocol claude cursor javascript nodejs search api_integration
georgi_terziyski_mcp_client_ollama
2

The Ollama MCP Client is designed to work with various language models such as Qwen, Llama 3, Mistral, and Gemini, served via Ollama. It supports real-time streaming of LLM responses and integrates seamlessly with the Database MCP Server, enabling natural language interactions with databases. This client is ideal for developers looking to leverage the power of language models for database operations and natural language queries.
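
Real-time streaming from Ollama works by reading newline-delimited JSON chunks from its local HTTP API; a minimal standalone sketch is shown below, with the model name and host as assumptions and the MCP plumbing omitted.

```python
# Minimal sketch of streaming a response from a locally served Ollama model.
# MCP client/server plumbing is omitted; the model name and host are assumptions.
import json
import requests

def stream_chat(prompt: str, model: str = "llama3") -> str:
    answer = []
    with requests.post(
        "http://localhost:11434/api/chat",            # default Ollama endpoint
        json={"model": model,
              "messages": [{"role": "user", "content": prompt}],
              "stream": True},
        stream=True,
        timeout=120,
    ) as resp:
        resp.raise_for_status()
        for line in resp.iter_lines():
            if not line:
                continue
            chunk = json.loads(line)
            piece = chunk.get("message", {}).get("content", "")
            print(piece, end="", flush=True)          # show tokens as they arrive
            answer.append(piece)
            if chunk.get("done"):
                break
    return "".join(answer)
```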

mcp_server model_context_protocol python ollama language_models streaming database_integration natural_language_interface
zippland_weather_mcp by Zippland
2

The OpenWeather MCP Server is a Model Context Protocol (MCP) server designed to provide global weather forecasts and current weather conditions. It integrates with the OpenWeatherMap API to fetch detailed weather information, including temperature, humidity, wind speed, and more. The server supports querying weather conditions anywhere in the world and can be easily configured using environment variables or direct API key parameters. It is designed to work seamlessly with MCP-supported clients, making it a versatile tool for weather-related applications.
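
A stripped-down version of such a tool might look like the following, using the Python MCP SDK and OpenWeatherMap's current-weather endpoint; the environment variable name is an assumption.

```python
# Stripped-down current-conditions tool; OPENWEATHER_API_KEY is an assumed env var.
import os
import requests
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("openweather")

@mcp.tool()
def current_weather(city: str) -> str:
    """Return temperature, humidity, and wind speed for a city (metric units)."""
    resp = requests.get(
        "https://api.openweathermap.org/data/2.5/weather",
        params={"q": city, "appid": os.environ["OPENWEATHER_API_KEY"], "units": "metric"},
        timeout=30,
    )
    resp.raise_for_status()
    data = resp.json()
    return (f"{city}: {data['weather'][0]['description']}, "
            f"{data['main']['temp']}°C, humidity {data['main']['humidity']}%, "
            f"wind {data['wind']['speed']} m/s")

if __name__ == "__main__":
    mcp.run(transport="stdio")
```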

mcp_server model_context_protocol python weather_forecast api_integration openweathermap
crazyrabbitltc_mcp_brex_server by crazyrabbitLTC
2

This MCP server provides a bridge between AI agents and the Brex financial platform, allowing agents to retrieve account information, access expense data, manage budget resources, and view team information. It implements standardized resource handlers and tools following the MCP specification, enabling secure and efficient access to financial data. The server supports features like receipt management, expense tracking, and budget monitoring, making it a powerful tool for AI-driven financial operations.

mcp_server model_context_protocol typescript javascript brex_api api_integration financial_data claude
blankcut_kubernetes_mcp_server

The Claude Kubernetes MCP Server is a Go-based implementation designed to orchestrate Kubernetes workloads using Claude AI, ArgoCD, GitLab, and Vault. It provides a REST API for programmatic interaction with these systems, enabling advanced automation and control of Kubernetes environments. The server supports local development, Docker deployment, and production-ready Helm chart deployment for Kubernetes clusters.

mcp_server model_context_protocol go kubernetes claude argocd gitlab docker api_integration automation
antonorlov_mcp_postgres_server

The MCP PostgreSQL Server provides a standardized interface for AI models to perform database operations on PostgreSQL. It supports secure connections, prepared statements, and comprehensive error handling, making it easier for AI models to query and manage PostgreSQL databases. Features include automatic connection management, support for PostgreSQL-specific syntax, and TypeScript integration.
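
The core idea, parameterized queries behind an MCP tool rather than string-built SQL, can be sketched in Python (the project itself is JavaScript/TypeScript; the connection string environment variable is an assumption).

```python
# Python sketch of the pattern (the actual server is JavaScript/TypeScript):
# an MCP tool that runs parameterized queries instead of interpolating SQL.
import os
import json
import psycopg2                      # PostgreSQL driver
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("postgres")

@mcp.tool()
def run_query(sql: str, params: list | None = None) -> str:
    """Execute a query with bound parameters and return rows as JSON."""
    conn = psycopg2.connect(os.environ["DATABASE_URL"])   # assumed env var
    try:
        with conn.cursor() as cur:
            cur.execute(sql, params)                      # driver binds the params
            cols = [c[0] for c in cur.description]
            rows = [dict(zip(cols, row)) for row in cur.fetchall()]
        return json.dumps(rows, default=str)
    finally:
        conn.close()

if __name__ == "__main__":
    mcp.run(transport="stdio")
```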

mcp_server model_context_protocol postgresql javascript api_integration database typescript
b12io_b12_mcp_server

The B12 MCP Server is designed to facilitate AI-driven website generation, implementing the Model Context Protocol to streamline the process. It provides a robust framework for integrating AI-driven web design, enabling efficient and scalable website creation, and is particularly useful for automating and enhancing the web development workflow.

mcp_server model_context_protocol ai website_generation web_development
xing5_mcp_google_sheets

This MCP server integrates with Google Drive and Google Sheets, enabling users to create, modify, and manage spreadsheets programmatically. It supports both service account and OAuth 2.0 authentication methods, making it suitable for both automated and interactive use cases. The server provides tools for retrieving sheet data, updating cells, batch updates, listing sheets, and creating new spreadsheets or sheets.
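
Service-account access of that kind is typically a few lines with google-api-python-client; in the sketch below the key-file path, spreadsheet ID, and range are placeholders.

```python
# Reading a range with a service account; the key-file path, spreadsheet ID,
# and range below are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/spreadsheets"]

creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
sheets = build("sheets", "v4", credentials=creds)

result = (
    sheets.spreadsheets()
    .values()
    .get(spreadsheetId="YOUR_SPREADSHEET_ID", range="Sheet1!A1:C10")
    .execute()
)
for row in result.get("values", []):
    print(row)
```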

mcp_server model_context_protocol python google_sheets api_integration google_drive automation
rajyraman_genai_mcp by rajyraman
2

This repository showcases the MCP Server functionalities of GenAIScript, a framework designed to facilitate communication with AI models, including local ones. It provides a standardized way to connect AI models to various data sources and tools, similar to a USB-C port for AI applications. The demo includes configuration examples for setting up the MCP server in VSCode and integrating it with GitHub Copilot.

mcp_server model_context_protocol genaiscript typescript ai_models vscode api_integration
dreamcenter_springboot_mcpserver_junit

This project provides a Spring Boot-based template for developing MCP servers, supporting both STDIO and Server-Sent Events (SSE) modes. It includes JUnit for unit testing and offers flexibility in configuring message and SSE endpoints. The template is designed to streamline the development of MCP-compliant servers, making it easier to integrate with various AI tools and services.

mcp_server model_context_protocol springboot junit sse stdio java api_integration
pblittle_lnd_mcp_server
2

The LND MCP Server connects to Lightning Network nodes, allowing users to query channel information and other node data using natural language. It provides structured JSON responses alongside human-readable answers, integrates with MCP-supporting LLMs, and supports secure connections via TLS certificates and macaroons. It also includes a mock mode for development without a real LND node.
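
Secure access in that style boils down to presenting the node's TLS certificate and a hex-encoded macaroon on every request; a minimal Python sketch against LND's REST API follows (the project itself is TypeScript, and the file paths and host are placeholders).

```python
# Querying an LND node's channels over its REST API. Paths and host are
# placeholders; the macaroon is sent hex-encoded in a gRPC-metadata header.
import codecs
import requests

LND_REST = "https://localhost:8080"                   # default LND REST port
TLS_CERT = "/path/to/tls.cert"
MACAROON = "/path/to/readonly.macaroon"

with open(MACAROON, "rb") as f:
    macaroon_hex = codecs.encode(f.read(), "hex").decode()

resp = requests.get(
    f"{LND_REST}/v1/channels",
    headers={"Grpc-Metadata-macaroon": macaroon_hex},
    verify=TLS_CERT,                                   # pin the node's TLS cert
    timeout=30,
)
resp.raise_for_status()
for channel in resp.json().get("channels", []):
    print(channel["chan_id"], channel["capacity"], channel["local_balance"])
```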

mcp_server model_context_protocol lightning_network lnd typescript javascript api_integration search
seanivore_the_pensieve by seanivore
2

The Pensieve MCP Server is a TypeScript-based implementation of a RAG (Retrieval-Augmented Generation) knowledge management system. It allows users to store and query knowledge using natural language, with LLM-powered analysis and response synthesis. Features include memory-based URIs, markdown file storage, and tools like `ask_pensieve` for contextual answers based on stored knowledge.
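
The retrieval half of such a RAG setup can be pictured very simply: pull the stored markdown notes that best match the question, then hand them to the LLM as context. The sketch below uses naive keyword overlap in place of the project's LLM-powered analysis, and the notes directory is a placeholder.

```python
# Greatly simplified retrieval step: score stored markdown notes by keyword
# overlap with the question (the real server uses LLM-powered analysis and
# response synthesis). The notes directory is a placeholder.
from pathlib import Path

def ask_pensieve(question: str, notes_dir: str = "memories/", top_k: int = 3) -> str:
    terms = set(question.lower().split())
    scored = []
    for note in Path(notes_dir).glob("*.md"):
        text = note.read_text(encoding="utf-8")
        score = sum(1 for t in terms if t in text.lower())
        scored.append((score, note.name, text))
    scored.sort(reverse=True)
    context = "\n\n---\n\n".join(text for _, _, text in scored[:top_k])
    # In the real server this context would be passed to an LLM to synthesize
    # a contextual answer; here we just return the retrieved notes.
    return context
```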

mcp_server model_context_protocol typescript javascript rag ai_memory claude search api_integration