MCP Servers


All MCP Servers
Complete list of MCP server implementations, sorted by stars

timmy9527_agentziwei by Timmy9527
3

Ziwei AI is an application that draws on 'Ziwei Dou Shu', a traditional Chinese astrology system, to help users understand themselves and organize their lives more effectively. It combines this traditional knowledge with modern AI technology, offering features like charting, chart analysis, and interactive reports. The application aims to provide a professional yet accessible tool for exploring personality traits, natural talents, and life trends through Ziwei Dou Shu charts.

mcp_server model_context_protocol ai ziwei_doushu chart_analysis cultural_inheritance big_data
seriallazer_ibkr_mcp_server
3

mcp_server model_context_protocol
daniboycg_mcp_servers by Daniboycg
3

This repository hosts configurations and scripts for various MCP (Model Context Protocol) servers, facilitating the integration of external tools with language models such as Claude in Cursor. It includes setups for Firecrawl, Browser Tools, Supabase, Git, and more, enhancing the capabilities of AI-driven workflows.

mcp_server model_context_protocol python javascript typescript docker api_integration search claude perplexity firecrawl
douglarek_unsplash_mcp_server

This project is an MCP server designed to facilitate Unsplash image search integration. It uses the mark3labs/mcp-go library to provide a streamlined interface for searching and retrieving images from Unsplash. The server can be integrated into applications like Cursor, making it easy to add image search capabilities to your projects.

mcp_server model_context_protocol go unsplash search api_integration
wrediam_n8n_coolify_mcp_tools

This project leverages the Community n8n MCP Client and the Coolify MCP Server to enable seamless interaction with Coolify infrastructure using the Model Context Protocol (MCP). It provides tools for managing teams, servers, services, applications, and deployments, streamlining infrastructure management with AI-driven automation. The Coolify MCP Server acts as a bridge between Coolify and n8n, facilitating efficient API communication and scalable automation.

mcp_server model_context_protocol n8n coolify api_integration automation infrastructure_management
radial_hks_mcp_unreal_server

This MCP server enables interaction with Unreal Engine instances by providing remote Python execution capabilities. It features Unreal instance management, remote execution modes, and detailed logging and monitoring. The server supports automatic discovery of Unreal nodes, real-time status monitoring, and execution of Python code in both attended and unattended modes.

mcp_server model_context_protocol python unreal_engine remote_execution api_integration logging monitoring
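To make the remote-execution idea concrete, a command envelope for such a server might look like the sketch below. The field names and wire format here are invented for illustration; they are not the project's actual protocol.

```python
import json

def build_exec_request(node_id: str, code: str, unattended: bool = False) -> str:
    """Build a hypothetical JSON command envelope for running Python on a
    discovered Unreal node. Field names are illustrative placeholders."""
    return json.dumps({
        "type": "python_exec",
        "node": node_id,
        "code": code,
        # the server distinguishes attended vs unattended execution modes
        "mode": "unattended" if unattended else "attended",
    })

request = build_exec_request(
    "ue5-editor-01",
    "print(unreal.SystemLibrary.get_engine_version())",
)
```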
tmstack_mcp_servers_hub

The MCP Servers and Clients Hub is a centralized repository for discovering and contributing to a wide range of Model Context Protocol (MCP) servers and clients. It provides a comprehensive list of MCP implementations, tools, and integrations across different domains such as AI services, system automation, search, and more. This hub aims to facilitate the development and adoption of MCP by offering a collaborative platform for developers and users.

mcp_server model_context_protocol javascript typescript python rust go docker api_integration search
hajime_y_deep_research_mcp

The Deep Research MCP Server is an agent-based tool designed to provide advanced research capabilities, including web search, PDF and document analysis, image analysis, YouTube transcript retrieval, and archive site search. It leverages HuggingFace's smolagents and requires Python 3.11 or higher, along with API keys for OpenAI, HuggingFace, and SerpAPI. The server is implemented as an MCP server, offering a robust solution for complex research tasks.

mcp_server model_context_protocol python huggingface web_search api_integration document_analysis image_analysis
studentofjs_mcp_figma_to_react

The MCP Figma to React Converter is a server that automates the process of converting Figma designs into React components. It leverages the Figma API to fetch designs, extracts components, and generates React code with TypeScript and Tailwind CSS. The server supports various workflows, including fetching Figma projects, extracting components, and generating React component libraries. It also provides tools for enhancing components with accessibility features and supports both stdio and SSE transports for flexibility in deployment.

mcp_server model_context_protocol figma react typescript tailwind_css api_integration component_generation
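A toy Python sketch of the kind of code generation described, using a simplified stand-in for the Figma node payload (the real server works from the Figma API and handles many more cases):

```python
def figma_node_to_component(node: dict) -> str:
    """Emit a minimal React/TypeScript component from a simplified Figma
    node. The node shape here is a stand-in, not the real Figma API payload."""
    # kebab-case layer name -> PascalCase component name
    name = "".join(part.capitalize() for part in node["name"].split("-"))
    classes = " ".join(node.get("tailwind", []))
    return (
        f"export function {name}() {{\n"
        f'  return <div className="{classes}">{node.get("text", "")}</div>;\n'
        f"}}\n"
    )

tsx = figma_node_to_component(
    {"name": "primary-button", "tailwind": ["px-4", "py-2", "rounded"], "text": "Submit"}
)
```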
dubin555_clickhouse_mcp_server

The ClickHouse MCP Server provides AI assistants with a secure and structured way to explore and analyze databases. It enables them to list tables, read data, and execute SQL queries through a controlled interface, ensuring responsible database access. The server can be configured via environment variables or command-line arguments, and it integrates with tools like VSCode and Cline for seamless usage.

mcp_server model_context_protocol clickhouse python database ai_assistant api_integration sql
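The "controlled interface" idea can be sketched as a simple statement guard that rejects writes when the server runs read-only. This is an illustrative simplification, not the server's actual logic:

```python
READ_ONLY_PREFIXES = ("SELECT", "SHOW", "DESCRIBE", "EXISTS", "EXPLAIN")

def check_query(sql: str, read_only: bool = True) -> str:
    """Reject statements that could mutate data when running in a
    read-only configuration. A simplified sketch of this kind of gatekeeping."""
    statement = sql.strip().rstrip(";")
    if read_only and not statement.upper().startswith(READ_ONLY_PREFIXES):
        raise PermissionError(f"write statement blocked: {statement.split()[0]}")
    return statement

safe = check_query("SELECT count() FROM system.tables")
```

A prefix check like this is deliberately conservative; a production guard would parse the statement rather than inspect its first keyword.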
aasherkamal216_axiom by aasherkamal216
3

Axiom is an AI agent specialized in modern AI frameworks, libraries, and tools. It helps users create AI agents, RAG systems, chatbots, and full-stack development projects through natural language instructions. Built with LangGraph, MCP Docs, Chainlit, and Gemini models, it offers an interactive chat interface, access to multiple documentation sources, and customizable model settings. It also supports Docker for containerized deployment.

mcp_server model_context_protocol python docker langgraph chainlit gemini ai_agent api_integration
brandon_powers_mcp_kafka by brandon-powers
3

This MCP server facilitates reliable interactions between language models (LLM/SLM) and Apache Kafka, including ecosystem tools such as Kafka Connect, Burrow, and Cruise Control. It supports the core Kafka APIs (excluding Streams) and provides REST API integrations for Burrow and Cruise Control. The server lets language models consume and produce messages and describe Kafka clusters, topics, and consumer groups.

mcp_server model_context_protocol python kafka api_integration docker kafka_connect burrow cruise_control
epinault_elixir_mcp_server

This project is an Elixir-based implementation of the Model Context Protocol (MCP) server, designed to enable secure interactions between AI models and local or remote resources. It uses Server-Sent Events (SSE) as the transport protocol and includes tools like file listing, message echoing, and weather data retrieval. The server is built with Bandit and Plug, providing a lightweight and efficient solution for MCP-compliant applications.

mcp_server model_context_protocol elixir sse bandit plug api_integration
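The SSE framing such a transport uses can be sketched in a few lines (shown in Python for brevity; the server itself is Elixir). Each JSON-RPC message becomes one text/event-stream event: an `event:` line, a `data:` line, and a blank-line terminator.

```python
import json

def to_sse_event(message: dict, event: str = "message") -> str:
    """Frame a JSON-RPC message as a Server-Sent Events chunk."""
    payload = json.dumps(message, separators=(",", ":"))
    return f"event: {event}\ndata: {payload}\n\n"

chunk = to_sse_event({"jsonrpc": "2.0", "method": "tools/list", "id": 1})
```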
tuskermanshu_swagger_mcp_server

The Swagger MCP Server is designed to parse Swagger/OpenAPI documents, supporting both v2 and v3 specifications. It generates TypeScript type definitions and API client code for frameworks like Axios, Fetch, and React Query. The server integrates with the Model Context Protocol (MCP), enabling seamless integration with large language models. It also features optimized processing for large API documents, including caching, lazy loading, and incremental parsing.

mcp_server model_context_protocol typescript javascript api_integration docker swagger openapi code_generation
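A toy version of the type-generation step, reduced to flat object schemas (the real server also handles $refs, arrays, nesting, and client code for Axios, Fetch, and React Query):

```python
# OpenAPI primitive types -> TypeScript types
TS_TYPES = {"string": "string", "integer": "number", "number": "number", "boolean": "boolean"}

def schema_to_ts(name: str, schema: dict) -> str:
    """Turn a v3-style object schema into a TypeScript interface.
    An illustrative toy, not the project's actual generator."""
    required = set(schema.get("required", []))
    lines = [f"export interface {name} {{"]
    for prop, spec in schema.get("properties", {}).items():
        optional = "" if prop in required else "?"
        lines.append(f"  {prop}{optional}: {TS_TYPES.get(spec.get('type'), 'unknown')};")
    lines.append("}")
    return "\n".join(lines)

iface = schema_to_ts(
    "User",
    {"required": ["id"],
     "properties": {"id": {"type": "integer"}, "name": {"type": "string"}}},
)
```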
shogo_ma_docbase_mcp_server

The Docbase MCP Server is a Model Context Protocol (MCP) implementation designed to interact with Docbase, enabling users to search, retrieve, and create posts programmatically. It provides a command-line interface for building and running the server, with configuration options for custom API domains and tokens. This server simplifies Docbase integration into workflows requiring automated post management.

mcp_server model_context_protocol go api_integration docbase search
ajeetraina_docker_mcp_portal

mcp_server model_context_protocol
beverm2391_chain_of_thought_mcp_server

The Chain of Thought MCP Server uses Groq's API to call LLMs, exposing the raw chain-of-thought tokens from Qwen's QwQ reasoning model. It is designed to improve AI performance by enabling structured reasoning and verification steps, particularly in complex tool-use scenarios. The server plugs into standard MCP configurations, letting AI agents iteratively refine responses and improve decision-making.

mcp_server model_context_protocol python groq_api llm chain_of_thought reasoning_model api_integration
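Exposing raw reasoning tokens usually means separating them from the final answer. A sketch under the assumption that the model wraps its reasoning in <think>...</think> tags (a common convention for reasoning models; the actual delimiter depends on the model and API):

```python
def split_reasoning(text: str):
    """Separate chain-of-thought tokens from the final answer, assuming
    <think>...</think> delimiters. Returns (reasoning, answer)."""
    start, end = text.find("<think>"), text.find("</think>")
    if start == -1 or end == -1:
        # no reasoning block found: everything is the answer
        return "", text.strip()
    reasoning = text[start + len("<think>"):end].strip()
    answer = text[end + len("</think>"):].strip()
    return reasoning, answer

reasoning, answer = split_reasoning("<think>2 + 2 is 4</think>The answer is 4.")
```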
guoling2008_go_mcp_postgres
3

This project provides a lightweight, zero-burden MCP server designed to interact with PostgreSQL databases. It supports CRUD operations, schema management, and automation tools, eliminating the need for Node.js or Python environments. The server includes features like read-only mode, query plan checking, and SSE (Server-Sent Events) support, making it a versatile solution for database integration and automation tasks.

mcp_server model_context_protocol postgresql go database_integration automation crud_operations sse
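The query-plan-checking feature can be sketched as wrapping a statement in EXPLAIN so its plan is inspected before execution. A simplified illustration, not the project's actual Go code:

```python
# statement types PostgreSQL's EXPLAIN accepts (simplified subset)
EXPLAINABLE = {"SELECT", "INSERT", "UPDATE", "DELETE"}

def plan_check_sql(sql: str) -> str:
    """Build the EXPLAIN statement used to inspect a query's plan
    before running it. Illustrative sketch only."""
    statement = sql.strip().rstrip(";")
    first = statement.split()[0].upper()
    if first not in EXPLAINABLE:
        raise ValueError(f"EXPLAIN not applicable to {first}")
    return f"EXPLAIN {statement}"

planned = plan_check_sql("SELECT * FROM users WHERE id = 1;")
```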