This MCP server connects AI assistants such as Claude to the YouTube API, supporting video and channel search, retrieval of detailed video and channel metadata, and integration with Claude Desktop. It acts as a standardized bridge through which AI models can query up-to-date YouTube data.
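A minimal sketch of the kind of YouTube Data API v3 search call such a server wraps, written in Python with `requests`; the `search_videos` helper and the `YOUTUBE_API_KEY` variable name are illustrative, not the server's actual code:

```python
import os
import requests

# Assumes a YouTube Data API v3 key in the YOUTUBE_API_KEY environment variable.
API_KEY = os.environ["YOUTUBE_API_KEY"]

def search_videos(query: str, max_results: int = 5) -> list[dict]:
    """Search YouTube and return a list of {title, videoId} dicts."""
    resp = requests.get(
        "https://www.googleapis.com/youtube/v3/search",
        params={
            "part": "snippet",
            "q": query,
            "type": "video",
            "maxResults": max_results,
            "key": API_KEY,
        },
        timeout=10,
    )
    resp.raise_for_status()
    return [
        {"title": item["snippet"]["title"], "videoId": item["id"]["videoId"]}
        for item in resp.json().get("items", [])
    ]

if __name__ == "__main__":
    for video in search_videos("model context protocol"):
        print(video["videoId"], video["title"])
```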
The Google Calendar AutoAuth MCP Server is a Model Context Protocol (MCP) implementation designed to integrate Google Calendar with AI assistants like Claude Desktop. It supports auto authentication, allowing users to manage calendar events through natural language interactions. Features include creating, updating, deleting, and retrieving events, as well as listing and searching events within specified time ranges. The server supports natural language date/time input and integrates seamlessly with the Google Calendar API.
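For context, creating an event through the Google Calendar API typically looks like the Python sketch below (using `google-api-python-client`); the AutoAuth server manages credentials itself, so the `token.json` file and the event fields here are assumptions made only for illustration:

```python
from datetime import datetime, timedelta, timezone
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

# Assume an OAuth token has already been saved to token.json.
creds = Credentials.from_authorized_user_file(
    "token.json", scopes=["https://www.googleapis.com/auth/calendar"]
)
service = build("calendar", "v3", credentials=creds)

# Create a one-hour event starting tomorrow.
start = datetime.now(timezone.utc) + timedelta(days=1)
event = {
    "summary": "Project sync",
    "start": {"dateTime": start.isoformat()},
    "end": {"dateTime": (start + timedelta(hours=1)).isoformat()},
}

created = service.events().insert(calendarId="primary", body=event).execute()
print("Created event:", created["id"])
```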
This MCP server integrates with Readwise Reader and exposes your saved documents as resources, giving AI assistants structured, programmatic access to your Reader library for reading and managing saved content.
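A rough idea of the underlying Reader API call, assuming the v3 `list` endpoint and an access token in a `READWISE_TOKEN` environment variable (both assumptions to check against the Readwise documentation):

```python
import os
import requests

# Illustrative token variable name; the server would manage this itself.
token = os.environ["READWISE_TOKEN"]

resp = requests.get(
    "https://readwise.io/api/v3/list/",
    headers={"Authorization": f"Token {token}"},
    params={"location": "new"},  # e.g. only documents in the "new" queue
    timeout=10,
)
resp.raise_for_status()

for doc in resp.json().get("results", []):
    print(doc.get("title"), "-", doc.get("source_url"))
```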
This project implements a Model Context Protocol (MCP) server designed to fetch and aggregate trending topics from multiple platforms, including Sina Weibo and Zhihu. It provides a structured way to retrieve and manage hot topics, making it easier to integrate with other systems or applications. The server is built using Python and follows a modular architecture for adding new platforms.
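One common way to structure such a modular design is a small registry of per-platform fetchers; the class and function names below are hypothetical and only illustrate the pattern, not the project's actual code:

```python
from abc import ABC, abstractmethod

class TrendingSource(ABC):
    """Hypothetical base class: each platform provides a name and fetch()."""
    name: str

    @abstractmethod
    def fetch(self) -> list[dict]:
        """Return items shaped like {"title": ..., "url": ..., "rank": ...}."""

class WeiboSource(TrendingSource):
    name = "weibo"

    def fetch(self) -> list[dict]:
        # A real implementation would call the Weibo hot-search endpoint here;
        # a static item keeps the sketch runnable.
        return [{"title": "example topic", "url": "https://s.weibo.com/", "rank": 1}]

REGISTRY: dict[str, TrendingSource] = {}

def register(source: TrendingSource) -> None:
    """Adding a new platform means writing one subclass and registering it."""
    REGISTRY[source.name] = source

def get_trending(platform: str) -> list[dict]:
    return REGISTRY[platform].fetch()

register(WeiboSource())
print(get_trending("weibo"))
```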
This MCP server facilitates seamless interaction between Claude AI and MySQL databases, allowing for database connections, query execution, and data retrieval in JSON format. It supports configuration via environment variables and includes tools for connecting to databases and executing SQL queries. The server is designed for easy integration with Claude AI and can be installed using pip or uv.
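Under the hood this amounts to reading connection settings from the environment, running SQL, and serializing rows to JSON; the sketch below (using `pymysql` and illustrative variable names) shows the idea rather than the server's actual implementation:

```python
import json
import os
import pymysql

# Connection details read from environment variables; the names are illustrative.
conn = pymysql.connect(
    host=os.environ.get("MYSQL_HOST", "localhost"),
    user=os.environ["MYSQL_USER"],
    password=os.environ["MYSQL_PASSWORD"],
    database=os.environ["MYSQL_DATABASE"],
    cursorclass=pymysql.cursors.DictCursor,
)

def execute_query(sql: str) -> str:
    """Run a query and return the rows as a JSON string."""
    with conn.cursor() as cursor:
        cursor.execute(sql)
        return json.dumps(cursor.fetchall(), default=str)

print(execute_query("SELECT NOW() AS server_time"))
```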
The Kaltura MCP Server implements the Model Context Protocol (MCP) to provide AI models with access to Kaltura's media management capabilities. It allows AI assistants to upload, retrieve, search, and manage media content through a standardized interface. The server supports Docker and Python 3.10+, and includes comprehensive documentation and example clients for easy integration.
Tiny TODO MCP is a specialized server that implements the Model Context Protocol (MCP), giving AI assistants persistent task storage. AI models can create, update, delete, and search tasks, track due dates and completion status, and thereby maintain context over time. The server uses a SQLite database and follows a clean layered architecture for easy integration with AI assistants.
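A layered SQLite store for tasks with due dates and completion status might look like the following sketch; the schema and function names are assumptions, not necessarily those used by Tiny TODO MCP:

```python
import sqlite3
from datetime import date

# Illustrative schema; column names are assumptions.
conn = sqlite3.connect("tasks.db")
conn.execute(
    """CREATE TABLE IF NOT EXISTS tasks (
           id INTEGER PRIMARY KEY AUTOINCREMENT,
           title TEXT NOT NULL,
           due_date TEXT,
           completed INTEGER NOT NULL DEFAULT 0
       )"""
)

def create_task(title: str, due: date | None = None) -> int:
    cur = conn.execute(
        "INSERT INTO tasks (title, due_date) VALUES (?, ?)",
        (title, due.isoformat() if due else None),
    )
    conn.commit()
    return cur.lastrowid

def complete_task(task_id: int) -> None:
    conn.execute("UPDATE tasks SET completed = 1 WHERE id = ?", (task_id,))
    conn.commit()

def search_tasks(keyword: str) -> list[tuple]:
    return conn.execute(
        "SELECT id, title, due_date, completed FROM tasks WHERE title LIKE ?",
        (f"%{keyword}%",),
    ).fetchall()

task_id = create_task("Write release notes", due=date(2025, 1, 31))
complete_task(task_id)
print(search_tasks("release"))
```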
The LI.FI Cross-Chain Swap MCP Server integrates with the LI.FI API to facilitate seamless cross-chain token swaps across multiple liquidity pools and bridges. It supports operations like fetching token and chain information, executing cross-chain transactions, and managing wallet interactions. This server is designed for developers and users looking to leverage decentralized finance (DeFi) for efficient cross-chain asset transfers.
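The kind of REST calls involved can be sketched as follows; the base URL, endpoints, and parameter names reflect LI.FI's public API as commonly documented and should be verified against the current docs, and the wallet address is a placeholder:

```python
import requests

BASE = "https://li.quest/v1"

# List supported chains.
chains = requests.get(f"{BASE}/chains", timeout=10).json()
print("Supported chains:", len(chains.get("chains", [])))

# Request a quote for moving 10 USDC from Ethereum (chain id 1) to Polygon
# (chain id 137); the fromAddress below is a placeholder wallet.
quote = requests.get(
    f"{BASE}/quote",
    params={
        "fromChain": 1,
        "toChain": 137,
        "fromToken": "USDC",
        "toToken": "USDC",
        "fromAmount": "10000000",  # 10 USDC with 6 decimals
        "fromAddress": "0x0000000000000000000000000000000000000000",
    },
    timeout=10,
).json()
print("Estimated amount received:", quote.get("estimate", {}).get("toAmount"))
```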
The MCP Tools CLI is a command-line interface for interacting with Model Context Protocol (MCP) servers. Using a configuration file, it can list a server's available tools and invoke specific tools with arguments. It is particularly useful for developers working with MCP servers, providing a streamlined way to exercise server tools directly from the terminal.
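The same two operations, listing tools and calling one, can also be done programmatically; the sketch below assumes the official Python MCP SDK's stdio client API, and the server command and tool name are placeholders:

```python
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Placeholder server command; a real config would point at an actual server.
    params = StdioServerParameters(command="python", args=["my_mcp_server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Equivalent of the CLI's "list tools" operation.
            tools = await session.list_tools()
            print("Available tools:", [t.name for t in tools.tools])
            # Equivalent of calling a specific tool with arguments.
            result = await session.call_tool("echo", arguments={"text": "hello"})
            print("Result:", result)

asyncio.run(main())
```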
The Google Scholar MCP Server is a Python-based implementation of the Model Context Protocol (MCP) designed to integrate Google Custom Search functionality into applications like Claude Desktop. It allows users to search academic content through Google Scholar by leveraging the Google Custom Search API. The server is containerized using Docker and can be installed manually or via Smithery for seamless integration.
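For reference, the underlying Custom Search JSON API request looks roughly like this; the environment variable names and result formatting are assumptions, and the search engine (`cx`) is assumed to be configured to cover scholar.google.com:

```python
import os
import requests

API_KEY = os.environ["GOOGLE_API_KEY"]
CSE_ID = os.environ["GOOGLE_CSE_ID"]  # custom search engine scoped to scholar.google.com

def search_scholar(query: str, num: int = 5) -> list[dict]:
    """Query the Custom Search JSON API and return title/link pairs."""
    resp = requests.get(
        "https://www.googleapis.com/customsearch/v1",
        params={"key": API_KEY, "cx": CSE_ID, "q": query, "num": num},
        timeout=10,
    )
    resp.raise_for_status()
    return [
        {"title": item["title"], "link": item["link"]}
        for item in resp.json().get("items", [])
    ]

for hit in search_scholar("retrieval augmented generation"):
    print(hit["link"], "-", hit["title"])
```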
This project provides a reproducible demo that highlights the limitations of an MCP SSE server when run under the Bun runtime as opposed to Node.js. It includes steps to install dependencies, run the MCP Inspector, and observe how server behavior differs between Bun and Node.js, helping developers understand the constraints and potential issues of using Bun for MCP SSE server implementations.
The MCP Calculate Server is a mathematical computation service that leverages the MCP protocol and the SymPy library to perform advanced symbolic calculations. It supports a wide range of operations, including basic arithmetic, algebraic manipulations, calculus, equation solving, matrix operations, and series expansions. This server is ideal for applications requiring precise and complex mathematical computations, offering a robust API for integration with other systems.
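The SymPy calls backing those operation categories are straightforward; this short sketch mirrors the listed capabilities, though the exact tool names the server exposes may differ:

```python
import sympy as sp

x = sp.symbols("x")

print(sp.simplify((x**2 - 1) / (x - 1)))    # algebra: x + 1
print(sp.diff(sp.sin(x) * sp.exp(x), x))    # calculus: exp(x)*sin(x) + exp(x)*cos(x)
print(sp.solve(sp.Eq(x**2 - 4, 0), x))      # equation solving: [-2, 2]
print(sp.Matrix([[1, 2], [3, 4]]).det())    # matrix operations: -2
print(sp.series(sp.cos(x), x, 0, 6))        # series expansion up to O(x**6)
```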
The OpenSCAD MCP Server enables users to create 3D models from text descriptions or images, leveraging AI image generation, multi-view reconstruction, and OpenSCAD integration. It supports remote processing for computationally intensive tasks, image approval workflows, and parametric model export in various formats. The server is designed for users who need to generate and refine 3D models efficiently.
The PostgreSQL MCP Server facilitates interaction between AI assistants and PostgreSQL databases by providing tools to execute SQL queries, create tables, and list database tables. It supports features like read_query, write_query, create_table, and list_tables, making it a powerful backend for AI-driven database operations. The server is built with Go and integrates seamlessly with MCP-enabled AI systems.
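Although the server itself is written in Go, the semantics of its read_query and list_tables tools can be sketched in Python with `psycopg2`; the connection string and function names here are illustrative only:

```python
import psycopg2
import psycopg2.extras

# Placeholder connection details.
conn = psycopg2.connect("dbname=appdb user=app password=secret host=localhost")

def read_query(sql: str) -> list[dict]:
    """Run a SELECT and return rows as dictionaries (read_query semantics)."""
    with conn.cursor(cursor_factory=psycopg2.extras.RealDictCursor) as cur:
        cur.execute(sql)
        return [dict(row) for row in cur.fetchall()]

def list_tables() -> list[str]:
    """Return the names of tables in the public schema (list_tables semantics)."""
    rows = read_query(
        "SELECT table_name FROM information_schema.tables "
        "WHERE table_schema = 'public' ORDER BY table_name"
    )
    return [row["table_name"] for row in rows]

print(list_tables())
```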
The Cohere MCP Server is an implementation of the Model Context Protocol (MCP) that integrates with Cohere's API. It lets developers use Cohere's multilingual models and retrieval capabilities in their applications, exposing tools and resources for working with generative and multilingual models from MCP-enabled clients.
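A minimal sketch of calling Cohere's API with its Python SDK, assuming an API key in a `CO_API_KEY` environment variable; the model name and response fields shown are examples and may need adjusting to the current SDK:

```python
import os
import cohere

co = cohere.Client(os.environ["CO_API_KEY"])

# Multilingual embeddings, the building block for retrieval.
emb = co.embed(
    texts=["Where is the library?", "Ou est la bibliotheque?"],
    model="embed-multilingual-v3.0",
    input_type="search_query",
)
print(len(emb.embeddings), "embeddings returned")

# Generative chat.
reply = co.chat(message="Summarize the Model Context Protocol in one sentence.")
print(reply.text)
```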