The OceanBase MCP Server enables secure and efficient communication with OceanBase databases over the Model Context Protocol (MCP). It gives MCP clients a controlled interface for database interactions while preserving data integrity and security, and it is designed to support the tools and applications that rely on OceanBase for database management and integration.
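To make the interaction model concrete, here is a minimal Python client sketch against this server. The launch command, package name, connection environment variables, and the `execute_sql` tool name are all illustrative assumptions rather than the documented interface; `list_tools()` on a live session reveals the real one.

```python
# Hypothetical sketch of querying OceanBase through an MCP client.
# Command, env vars, and tool name below are illustrative assumptions.
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(
    command="uvx",
    args=["oceanbase-mcp-server"],                    # assumed package name
    env={"OB_HOST": "127.0.0.1", "OB_PORT": "2881",   # assumed connection settings
         "OB_USER": "root", "OB_PASSWORD": "", "OB_DATABASE": "test"},
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Tool name and argument key are assumptions; list_tools() shows the real ones.
            result = await session.call_tool("execute_sql", arguments={"query": "SELECT 1"})
            print(result.content)

asyncio.run(main())
```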
WebSearch-MCP is a Model Context Protocol server designed to enable real-time web search functionality for AI assistants such as Claude. It integrates with a WebSearch Crawler API to retrieve up-to-date search results, allowing AI models to access current information on any topic. The server supports configuration via environment variables and can be easily integrated with various MCP clients.
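Since the entry mentions configuration through environment variables, the sketch below shows one way an MCP client might launch the server and run a query. The package name, the `API_URL` variable, and the `search` tool with its arguments are assumptions inferred from the description, not verified names.

```python
# Hypothetical sketch: launching WebSearch-MCP with env-var configuration
# and issuing a search. Package, variable, and tool names are assumptions.
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(
    command="npx",
    args=["-y", "websearch-mcp"],                # assumed package name
    env={"API_URL": "http://localhost:3001"},    # assumed crawler API endpoint variable
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "search", arguments={"query": "latest MCP specification", "numResults": 5})
            for block in result.content:         # text blocks carrying the search results
                print(getattr(block, "text", block))

asyncio.run(main())
```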
The Fetch MCP Server is designed to retrieve and process web content, even from pages requiring JavaScript rendering or using anti-scraping techniques. It leverages browser automation, OCR, and multiple extraction methods to ensure high-quality content retrieval. The server includes a sophisticated scoring system to select the best results, making it ideal for integrating with LLMs for web content processing.
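A hedged sketch of how a client could request a rendered page follows; the launch command and the `fetch` tool name and argument keys are assumptions, so check the server's advertised tool list for the actual interface.

```python
# Hypothetical sketch: asking the Fetch MCP Server for a rendered page.
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(command="npx", args=["-y", "fetch-mcp"])  # assumed package

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Tool and argument names are assumptions for illustration.
            result = await session.call_tool(
                "fetch", arguments={"url": "https://example.com", "maxLength": 5000})
            for block in result.content:
                print(getattr(block, "text", block))

asyncio.run(main())
```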
The ArgoCD MCP Server integrates with the ArgoCD API, allowing AI assistants and large language models to manage ArgoCD applications and resources through natural language. It provides features like application management, authentication, and robust API client handling, enhancing developer experience with static type checking and detailed documentation.
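A minimal sketch of listing applications through the server, assuming an npx-launchable package, `ARGOCD_BASE_URL`/`ARGOCD_API_TOKEN` environment variables, and a `list_applications` tool; all of these names are guesses for illustration.

```python
# Hypothetical sketch: listing ArgoCD applications through the MCP server.
# Package name, env vars, and tool name are assumptions.
import asyncio, os
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(
    command="npx",
    args=["-y", "argocd-mcp"],                               # assumed package name
    env={"ARGOCD_BASE_URL": "https://argocd.example.com",    # assumed variable names
         "ARGOCD_API_TOKEN": os.environ.get("ARGOCD_API_TOKEN", "")},
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            apps = await session.call_tool("list_applications", arguments={})
            print(apps.content)

asyncio.run(main())
```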
The Calculator MCP Server is designed to integrate with LLMs, providing a dedicated tool for precise numerical calculations. It supports a `calculate` function that evaluates mathematical expressions, ensuring accurate results. The server can be installed using `uv` or `pip` and configured to work seamlessly with MCP clients.
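Only the `calculate` tool name comes from the description above; the package name and the `expression` argument key in this sketch are assumptions.

```python
# Sketch of calling the server's `calculate` tool from a Python MCP client.
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(command="uvx", args=["mcp-server-calculator"])  # assumed package

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "calculate", arguments={"expression": "(2 + 3) * 4 / 7"})
            print(result.content)  # expected: a text block containing the numeric result

asyncio.run(main())
```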
The Notion Workspace Integration MCP Server provides a standardized interface for AI models like Claude to interact with Notion workspaces. It enables querying databases, retrieving page content, and managing updates. The server supports integration with Claude for Desktop, allowing users to query and manage tasks in their Notion workspace through natural language.
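A sketch of querying a Notion database through the server; the package name, the `NOTION_API_KEY` variable, and the `query_database` tool and its argument are assumptions for illustration.

```python
# Hypothetical sketch: querying a Notion database via the MCP server.
# Package name, env var, tool name, and arguments are assumptions.
import asyncio, os
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(
    command="npx",
    args=["-y", "notion-mcp-server"],                              # assumed package name
    env={"NOTION_API_KEY": os.environ.get("NOTION_API_KEY", "")},  # assumed credential variable
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "query_database", arguments={"database_id": "<your-database-id>"})
            print(result.content)

asyncio.run(main())
```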
This MCP server provides audio transcription capabilities by leveraging OpenAI's Whisper API. It allows users to transcribe audio files into text, with optional features like saving the transcription to a file and specifying the language. The server is easy to set up and integrates seamlessly with OpenAI's API for accurate and efficient speech-to-text conversion.
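The sketch below shows how a client might invoke a transcription, with the optional language and save-to-file behaviour mirrored from the description; the package name, tool name, and argument keys are assumptions.

```python
# Hypothetical sketch: transcribing a local audio file via the Whisper-backed server.
import asyncio, os
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(
    command="uvx",
    args=["whisper-mcp-server"],                                   # assumed package name
    env={"OPENAI_API_KEY": os.environ.get("OPENAI_API_KEY", "")},  # Whisper API credential (assumed variable)
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "transcribe_audio",                                # assumed tool and argument names
                arguments={"file_path": "/path/to/meeting.mp3",
                           "language": "en", "save_to_file": True})
            print(result.content)

asyncio.run(main())
```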
This MCP server provides a comprehensive interface for interacting with HubSpot CRM, enabling seamless management of CRM objects like companies, contacts, and deals. It supports advanced features such as batch operations, property validation, and type-safe parameter validation using Zod. Designed for efficiency, it simplifies CRM data handling and integration workflows.
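For a flavour of the tool surface, here is a hedged sketch of a contact search; the package name, the access-token variable, and the `search_contacts` tool and arguments are assumptions.

```python
# Hypothetical sketch: searching HubSpot contacts through the MCP server.
import asyncio, os
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(
    command="npx",
    args=["-y", "hubspot-mcp-server"],   # assumed package name
    env={"HUBSPOT_ACCESS_TOKEN": os.environ.get("HUBSPOT_ACCESS_TOKEN", "")},  # assumed variable
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "search_contacts",       # assumed tool and argument names
                arguments={"query": "jane@example.com", "limit": 10})
            print(result.content)

asyncio.run(main())
```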
The Starknet MCP Server provides AI agents with tools to interact with the Starknet blockchain, including querying blockchain data, managing wallets, and interacting with smart contracts. It supports both Mainnet and Sepolia testnet, integrates with StarknetID, and offers features like native token operations, NFT management, and smart contract interactions. Designed for use with AI assistants like Claude and GPT, it simplifies blockchain interactions through a consistent MCP interface.
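A rough sketch of a read-only balance query, with the mainnet/testnet choice passed as an argument; the package name, tool name, and argument keys are assumptions.

```python
# Hypothetical sketch: reading an account balance on Starknet mainnet.
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(command="npx", args=["-y", "starknet-mcp-server"])  # assumed package

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "get_balance",                                       # assumed tool name
                arguments={"address": "<account-address>",           # placeholder address
                           "network": "mainnet"})                    # or "sepolia" for the testnet
            print(result.content)

asyncio.run(main())
```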
The Open Docs MCP Server is an open-source implementation of the Model Context Protocol (MCP) designed for document management. It provides features such as crawling and indexing documentation from various sources, supporting multiple document formats, and offering full-text search capabilities. The server integrates with tools like Claude Desktop and includes functionalities like enabling/disabling document crawling, building search indexes, and managing custom documentation sources. It is built with Node.js and TypeScript, making it a versatile solution for developers needing efficient document handling and search integration.
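Because the tool set is fairly broad, the sketch below first lists the advertised tools and then runs a search; the package name, the `search_docs` tool, and its argument are assumptions.

```python
# Hypothetical sketch: discovering the server's tools and running a docs search.
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(command="npx", args=["-y", "open-docs-mcp"])  # assumed package

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([t.name for t in tools.tools])   # crawl/index/search tools as advertised
            result = await session.call_tool(
                "search_docs",                     # assumed tool and argument names
                arguments={"query": "authentication middleware"})
            print(result.content)

asyncio.run(main())
```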
The Pinax MCP Client connects to MCP Server-compatible services hosted on The Graph Market. It provides authentication, server-sent events (SSE), and verbose logging, letting developers interact efficiently with the platform. It works with MCP hosts such as Claude Desktop, Cline, and Cursor, extending the capabilities of AI-driven applications.
The Bocha AI Web Search MCP Server is designed to provide powerful search capabilities for Chinese internet content, adhering to PRC regulations. It supports flexible time ranges, smart summaries, and dual output formats (Markdown and JSON) for both human-readable and programmatic use. The server includes robust error handling and is built to integrate seamlessly with Bocha AI's services.
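A hedged sketch of a time-bounded search returning Markdown; the package name, the API-key variable, the tool name, and the argument keys are assumptions patterned on the features described above.

```python
# Hypothetical sketch: a time-bounded Bocha AI search with Markdown output.
import asyncio, os
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(
    command="uvx",
    args=["bocha-search-mcp"],                                   # assumed package name
    env={"BOCHA_API_KEY": os.environ.get("BOCHA_API_KEY", "")},  # assumed credential variable
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "web_search",                                    # assumed tool and argument names
                arguments={"query": "人工智能 最新进展",
                           "freshness": "oneWeek",               # flexible time range
                           "format": "markdown"})                # or "json" for programmatic use
            print(result.content)

asyncio.run(main())
```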
The Neo N3 MCP Server provides a robust interface for interacting with the Neo N3 blockchain, supporting dual-network integration (mainnet and testnet). It enables users to query blockchain data, manage wallets, transfer assets, and invoke smart contracts. Designed for secure and efficient operation, it includes features like transaction monitoring, gas fee estimation, and support for famous Neo N3 contracts such as NeoFS, NeoBurger, and Flamingo. The server is deployable via Docker or NPM, ensuring ease of integration with existing systems.
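Using the NPM distribution mentioned above, a client sketch might look as follows; the package name, the `get_blockchain_info` tool, and the `network` argument are assumptions that mirror the dual-network support.

```python
# Hypothetical sketch: querying Neo N3 chain state from the NPM-distributed build.
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(command="npx", args=["-y", "neo-n3-mcp"])  # assumed package

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "get_blockchain_info",                    # assumed tool name
                arguments={"network": "testnet"})         # or "mainnet"
            print(result.content)

asyncio.run(main())
```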
This MCP server is designed to facilitate code reviews by integrating with Large Language Models (LLMs) such as OpenAI, Anthropic, and Gemini. It uses Repomix to flatten codebases, analyzes code with LLMs, and provides structured reviews with specific issues and recommendations. The server supports chunking for large codebases and offers tools for both high-level overviews and detailed code quality assessments.
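A sketch of requesting a repository review; the package name, the provider key variable, and the `code_review` tool and its arguments are assumptions.

```python
# Hypothetical sketch: requesting a structured review of a local repository.
import asyncio, os
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(
    command="npx",
    args=["-y", "code-review-mcp"],                                # assumed package name
    env={"OPENAI_API_KEY": os.environ.get("OPENAI_API_KEY", "")},  # key for the chosen LLM backend
)

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "code_review",                                     # assumed tool and argument names
                arguments={"repo_path": "/path/to/project", "focus": "code quality"})
            print(result.content)

asyncio.run(main())
```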
This MCP server is designed to interact with Sina Weibo, enabling users to fetch detailed user profiles, retrieve users' feed posts ("dynamics"), and search for users. It leverages the Model Context Protocol to provide a structured and efficient way to access and manage Weibo data. The server is particularly useful for applications that need to integrate Weibo user data for research or analysis.
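A rough sketch of a user search followed by a profile fetch; the package name, tool names, and argument keys are assumptions.

```python
# Hypothetical sketch: searching for a Weibo user, then fetching their profile.
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(command="uvx", args=["weibo-mcp-server"])  # assumed package

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Tool and argument names are assumptions for illustration.
            users = await session.call_tool(
                "search_users", arguments={"keyword": "新闻", "limit": 3})
            print(users.content)
            profile = await session.call_tool(
                "get_profile", arguments={"uid": "<numeric-user-id>"})
            print(profile.content)

asyncio.run(main())
```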
5ire is a cross-platform desktop AI assistant that functions as an MCP client, compatible with major AI service providers including OpenAI, Azure, and Anthropic. It supports a local knowledge base, enabling robust Retrieval-Augmented Generation (RAG) capabilities, and integrates with tools via the Model Context Protocol (MCP). The assistant also features usage analytics, a prompts library, bookmarks, and quick search functionality for enhanced productivity.