The Uniswap Trader MCP server enables AI agents to automate token swaps on the Uniswap decentralized exchange (DEX) across multiple blockchains. It offers real-time price quotes, swap execution with configurable parameters, and multi-chain support for Ethereum, Optimism, Polygon, and more. The server streamlines trading operations by optimizing routes, managing slippage, and generating trading suggestions based on liquidity and fees.
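As a rough sketch of how a client might talk to this server, the following uses the TypeScript MCP SDK to spawn it over stdio and request a swap quote; the launch command, the `get_price_quote` tool name, and the argument shape are assumptions for illustration, not the server's documented interface.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch command and entry point are assumptions; adjust to the actual build output.
  const transport = new StdioClientTransport({
    command: "node",
    args: ["build/index.js"],
  });
  const client = new Client({ name: "uniswap-demo", version: "0.1.0" }, { capabilities: {} });
  await client.connect(transport);

  // Hypothetical tool and arguments: quote a 1 ETH -> USDC swap on Ethereum mainnet.
  const quote = await client.callTool({
    name: "get_price_quote",
    arguments: { chainId: 1, tokenIn: "ETH", tokenOut: "USDC", amountIn: "1.0" },
  });
  console.log(quote.content);

  await client.close();
}

main().catch(console.error);
```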
This MCP server integrates with Google's Gemini 2 API to generate images based on user prompts. It supports customizable parameters such as aspect ratio, number of samples, and person generation settings. The server is designed to work with the Model Context Protocol (MCP) and can be easily configured and run using npm. It is ideal for applications requiring dynamic image generation.
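A minimal client-side sketch of an image-generation request, again using the TypeScript MCP SDK; the `generate_image` tool name, the parameter names, the package name passed to npx, and the API-key environment variable are assumptions based on the feature list above.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Package name and environment variable are assumptions for illustration.
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["-y", "gemini-image-mcp-server"],
    env: { GEMINI_API_KEY: process.env.GEMINI_API_KEY ?? "" },
  });
  const client = new Client({ name: "image-demo", version: "0.1.0" }, { capabilities: {} });
  await client.connect(transport);

  // Hypothetical tool and parameters mirroring the configurable options described above.
  const result = await client.callTool({
    name: "generate_image",
    arguments: {
      prompt: "A watercolor lighthouse at dusk",
      aspectRatio: "16:9",
      numberOfImages: 2,
      personGeneration: "dont_allow",
    },
  });
  console.log(result.content);

  await client.close();
}

main().catch(console.error);
```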
This project provides a Telegram client library and an MCP (Model Context Protocol) server, allowing AI assistants like Claude to interact with Telegram. The Telegram client library supports authentication, session management, and message retrieval, while the MCP server enables searching channels, listing available channels, and filtering messages by patterns. This integration facilitates seamless access to Telegram data for MCP-compatible assistants.
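A sketch of how an MCP client might search channels and then filter messages; the launch command and the `search_channels`/`filter_messages` tool names and arguments are assumptions, not the project's documented tool surface.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch command is an assumption; the server may be started differently.
  const transport = new StdioClientTransport({ command: "node", args: ["dist/mcp-server.js"] });
  const client = new Client({ name: "telegram-demo", version: "0.1.0" }, { capabilities: {} });
  await client.connect(transport);

  // Hypothetical tools: find channels by keyword, then pull messages matching a pattern.
  const channels = await client.callTool({
    name: "search_channels",
    arguments: { query: "rust" },
  });
  console.log(channels.content);

  const messages = await client.callTool({
    name: "filter_messages",
    arguments: { channel: "@example_channel", pattern: "release" },
  });
  console.log(messages.content);

  await client.close();
}

main().catch(console.error);
```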
The Chronicle SecOps MCP Server integrates with Google's Chronicle Security Operations API, providing functionality such as searching security events, retrieving alerts, and looking up entity information. It supports integration with Claude Desktop and includes customizable queries, entity lookups, and security rule listings. The server is built in Python and can be installed manually or via Smithery for a streamlined setup.
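The sketch below shows a client issuing an event search against the server over stdio; the Python launch command, the `search_security_events` tool name, and the argument names are assumptions based on the feature list, not the server's documented interface.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch command is an assumption; the server is Python-based, so a module entry point is shown.
  const transport = new StdioClientTransport({ command: "python", args: ["-m", "secops_mcp"] });
  const client = new Client({ name: "chronicle-demo", version: "0.1.0" }, { capabilities: {} });
  await client.connect(transport);

  // Hypothetical tool: search events from the last 24 hours matching a natural-language query.
  const events = await client.callTool({
    name: "search_security_events",
    arguments: { query: "network connections to suspicious domains", hoursBack: 24 },
  });
  console.log(events.content);

  await client.close();
}

main().catch(console.error);
```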
This repository contains modified versions of various MCP servers, including figma-server, gcloud-server, github, sonarqube-server, and litellm-server. These modifications are intended for testing and development, providing a flexible environment for experimenting with different configurations and integrations.
The PDFMe MCP Server manages and processes tasks over the Model Context Protocol (MCP), providing a structured way to handle and execute operations reliably. It is useful for integrating with AI tools and services, streamlining workflows and improving productivity.
The Memory Cache MCP Server reduces token consumption by caching data exchanged with language models. It works with any MCP client and any token-based language model. The server automatically stores and retrieves data, so repeated operations complete faster and consume fewer tokens, and it offers configurable settings for cache size, memory usage, and time-to-live to suit different use cases.
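As a rough illustration of the caching behavior, the sketch below stores a value and reads it back through hypothetical `store_data`/`retrieve_data` tools; the tool names, argument names, and launch command are assumptions, since the server is described as caching automatically for the connected client.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch command is an assumption; adjust to the server's actual build output.
  const transport = new StdioClientTransport({ command: "node", args: ["build/index.js"] });
  const client = new Client({ name: "cache-demo", version: "0.1.0" }, { capabilities: {} });
  await client.connect(transport);

  // Hypothetical tools: cache a value with a TTL, then read it back instead of recomputing it.
  await client.callTool({
    name: "store_data",
    arguments: { key: "analysis:report-42", value: "summary text", ttl: 3600 },
  });
  const cached = await client.callTool({
    name: "retrieve_data",
    arguments: { key: "analysis:report-42" },
  });
  console.log(cached.content);

  await client.close();
}

main().catch(console.error);
```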
This MCP server provides tools to interact with Service Nervous System (SNS) decentralized autonomous organizations (DAOs) on the Internet Computer. It implements core Model Context Protocol concepts, enabling functionality such as listing proposals, managing votable neurons, and voting on DAO proposals. The server is designed for integration with Claude Desktop and supports development workflows with auto-rebuild and debugging tools.
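A sketch of listing proposals and casting a vote through hypothetical tools; the `list_proposals`/`vote_on_proposal` tool names, the argument shapes, and the launch command are assumptions rather than the server's documented interface.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch command is an assumption; adjust to the server's actual build output.
  const transport = new StdioClientTransport({ command: "node", args: ["build/index.js"] });
  const client = new Client({ name: "sns-demo", version: "0.1.0" }, { capabilities: {} });
  await client.connect(transport);

  // Hypothetical tools: list open proposals for an SNS, then vote with a votable neuron.
  const proposals = await client.callTool({
    name: "list_proposals",
    arguments: { snsCanisterId: "example-canister-id", status: "open" },
  });
  console.log(proposals.content);

  await client.callTool({
    name: "vote_on_proposal",
    arguments: { proposalId: 42, neuronId: "example-neuron-id", vote: "yes" },
  });

  await client.close();
}

main().catch(console.error);
```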
The Model Context Protocol CLI Server is designed to execute shell scripts or commands efficiently. It leverages the Model Context Protocol to manage and process requests, making it a versatile tool for automation and integration tasks. This server is particularly useful for developers looking to streamline command execution in a controlled environment.
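A small sketch showing a client discovering the server's tools and then running a shell command through a hypothetical `run_command` tool; the tool name, argument name, and launch command are assumptions.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch command is an assumption; substitute the server's actual entry point.
  const transport = new StdioClientTransport({ command: "node", args: ["build/index.js"] });
  const client = new Client({ name: "cli-demo", version: "0.1.0" }, { capabilities: {} });
  await client.connect(transport);

  // Discover what the server actually exposes before calling anything.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  // Hypothetical tool and argument: execute a shell command and print its output.
  const result = await client.callTool({
    name: "run_command",
    arguments: { command: "echo hello from MCP" },
  });
  console.log(result.content);

  await client.close();
}

main().catch(console.error);
```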
Filesys is a filesystem MCP server that lets Large Language Models (LLMs) interact with the local file system. It gives LLMs access to a specified directory, enabling them to list its contents and read files. Built with Python, it simplifies file operations and integrates with the Model Context Protocol (MCP) for efficient file management.
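A sketch of listing and reading files via hypothetical `list_files` and `read_file` tools; the Python launch command, the allowed-directory argument, the tool names, and the argument names are assumptions based on the description above.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch command and the allowed-directory argument are assumptions.
  const transport = new StdioClientTransport({
    command: "python",
    args: ["server.py", "/home/user/notes"],
  });
  const client = new Client({ name: "filesys-demo", version: "0.1.0" }, { capabilities: {} });
  await client.connect(transport);

  // Hypothetical tools: list the directory, then read one of its files.
  const listing = await client.callTool({ name: "list_files", arguments: { path: "." } });
  console.log(listing.content);

  const file = await client.callTool({
    name: "read_file",
    arguments: { path: "todo.md" },
  });
  console.log(file.content);

  await client.close();
}

main().catch(console.error);
```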
The Agentic MCP SSHClient is a specialized MCP server implementation designed to enhance SSH security. It includes a configurable security agent that detects and blocks unsafe commands before they are executed over SSH connections. The project integrates with tools like Ollama and supports configuration via secagentconfig.json, making it adaptable to various MCP client setups.
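The sketch below runs a remote command through a hypothetical `ssh_execute` tool, which the configured security agent would screen before execution; the tool name, argument names, and launch command are assumptions, and only secagentconfig.json is taken from the project itself.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch command is an assumption; the security agent reads secagentconfig.json at startup.
  const transport = new StdioClientTransport({ command: "node", args: ["dist/index.js"] });
  const client = new Client({ name: "ssh-demo", version: "0.1.0" }, { capabilities: {} });
  await client.connect(transport);

  // Hypothetical tool: the security agent should allow a benign command like this,
  // while something destructive (e.g. "rm -rf /") would be detected and blocked.
  const result = await client.callTool({
    name: "ssh_execute",
    arguments: { host: "example.internal", username: "ops", command: "uptime" },
  });
  console.log(result.content);

  await client.close();
}

main().catch(console.error);
```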
The Sefaria Jewish Library MCP Server enables retrieval and referencing of Jewish texts through a standardized interface. It supports features like retrieving texts by reference and accessing commentaries, making it a valuable tool for integrating Jewish texts into AI workflows. The server is built using Python and integrates with the Sefaria API.
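A sketch retrieving a passage and its commentaries through hypothetical `get_text` and `get_commentaries` tools; the Python launch command and the tool and argument names are assumptions, with a Sefaria-style reference ("Genesis 1:1") used purely for illustration.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch command is an assumption; the server is described as Python-based.
  const transport = new StdioClientTransport({ command: "python", args: ["server.py"] });
  const client = new Client({ name: "sefaria-demo", version: "0.1.0" }, { capabilities: {} });
  await client.connect(transport);

  // Hypothetical tools: fetch a text by reference, then its commentaries.
  const text = await client.callTool({
    name: "get_text",
    arguments: { reference: "Genesis 1:1" },
  });
  console.log(text.content);

  const commentaries = await client.callTool({
    name: "get_commentaries",
    arguments: { reference: "Genesis 1:1" },
  });
  console.log(commentaries.content);

  await client.close();
}

main().catch(console.error);
```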
This project provides a simple MCP server implementation that connects Backstage with the Model Context Protocol (MCP) using Quarkus. It allows users to list available Backstage templates and instantiate them from the command line. The server is designed to work with local AI agents like Goose, enabling seamless integration and automation of template-based workflows.
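A sketch of listing Backstage templates and instantiating one over stdio; the Quarkus runner jar path and the `list_templates`/`instantiate_template` tool names and arguments are assumptions for illustration, not the project's documented interface.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // The jar path follows Quarkus' default fast-jar layout; treat it as an assumption.
  const transport = new StdioClientTransport({
    command: "java",
    args: ["-jar", "target/quarkus-app/quarkus-run.jar"],
  });
  const client = new Client({ name: "backstage-demo", version: "0.1.0" }, { capabilities: {} });
  await client.connect(transport);

  // Hypothetical tools: list available templates, then instantiate one with parameters.
  const templates = await client.callTool({ name: "list_templates", arguments: {} });
  console.log(templates.content);

  await client.callTool({
    name: "instantiate_template",
    arguments: { templateRef: "template:default/example-service", values: { name: "demo-app" } },
  });

  await client.close();
}

main().catch(console.error);
```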