SCAST is a programmatic tool for analyzing and summarizing code through visualization. It uses parsers to convert code into Abstract Syntax Trees (ASTs) and renders them with tools like Mermaid and D3. SCAST supports multiple programming languages, including JavaScript, TypeScript, and Python, and can also be integrated as an MCP server for AI clients.
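The core idea is to parse source text into an AST and then render its structure. A rough sketch of that pipeline (not SCAST's actual API), assuming @babel/parser and hand-built Mermaid output:

```typescript
// Generic illustration (not SCAST's API): parse source into an AST with
// @babel/parser and emit a minimal Mermaid flowchart of top-level nodes.
import { parse } from "@babel/parser";

const source = `
function greet(name: string) { return "hi " + name; }
const x = greet("world");
`;

// Parse as a TypeScript module.
const ast = parse(source, { sourceType: "module", plugins: ["typescript"] });

// Turn each top-level statement into a Mermaid node, chained in order.
const lines = ["flowchart TD"];
ast.program.body.forEach((node, i) => {
  lines.push(`  n${i}["${node.type}"]`);
  if (i > 0) lines.push(`  n${i - 1} --> n${i}`);
});

console.log(lines.join("\n"));
```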
LMStudio-MCP creates a bridge between Claude (with MCP capabilities) and your locally running LM Studio instance. This allows Claude to check the health of your LM Studio API, list available models, get the currently loaded model, and generate completions using your local models. In effect, Claude's tool use is paired with models that run entirely on your own machine.
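Under the hood, this kind of bridge talks to LM Studio's OpenAI-compatible local HTTP API. A minimal sketch of the model-listing and completion calls involved, assuming LM Studio's default port 1234 (not the project's actual code):

```typescript
// Sketch of the checks such a bridge performs against LM Studio's
// OpenAI-compatible local API (default http://localhost:1234; adjust if
// you changed the server port in LM Studio).
const BASE = "http://localhost:1234/v1";

async function listLocalModels(): Promise<string[]> {
  const res = await fetch(`${BASE}/models`);
  if (!res.ok) throw new Error(`LM Studio API unreachable: ${res.status}`);
  const body = await res.json();
  return body.data.map((m: { id: string }) => m.id);
}

async function complete(model: string, prompt: string): Promise<string> {
  const res = await fetch(`${BASE}/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, messages: [{ role: "user", content: prompt }] }),
  });
  const body = await res.json();
  return body.choices[0].message.content;
}

listLocalModels().then(console.log);
```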
This MCP server integrates with Replicate's FLUX model to generate images based on user prompts and stores the resulting images in Cloudflare R2. It provides accessible URLs for the generated images and supports custom prompts and filenames. The server is designed to be easily integrated with MCP-compatible clients, offering a seamless workflow for image generation and storage.
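A rough sketch of the generate-then-store flow, assuming the replicate npm client and R2's S3-compatible endpoint via @aws-sdk/client-s3; the model slug, bucket name, env vars, and output handling are illustrative rather than taken from this server:

```typescript
// Sketch only: model slug, bucket, env vars, and the shape of Replicate's
// output (which varies by client version) are assumptions.
import Replicate from "replicate";
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

const replicate = new Replicate({ auth: process.env.REPLICATE_API_TOKEN! });
const r2 = new S3Client({
  region: "auto",
  endpoint: `https://${process.env.R2_ACCOUNT_ID}.r2.cloudflarestorage.com`,
  credentials: {
    accessKeyId: process.env.R2_ACCESS_KEY_ID!,
    secretAccessKey: process.env.R2_SECRET_ACCESS_KEY!,
  },
});

async function generateAndStore(prompt: string, filename: string) {
  // 1. Run a FLUX model on Replicate (here assumed to return image URLs).
  const output = (await replicate.run("black-forest-labs/flux-schnell", {
    input: { prompt },
  })) as string[];

  // 2. Download the first image and upload it to an R2 bucket.
  const image = await fetch(output[0]);
  const bytes = new Uint8Array(await image.arrayBuffer());
  await r2.send(new PutObjectCommand({
    Bucket: "generated-images",       // placeholder bucket name
    Key: filename,
    Body: bytes,
    ContentType: "image/webp",
  }));
  return `https://<your-public-r2-domain>/${filename}`;
}
```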
The Gemini Context MCP Server is a powerful implementation of the Model Context Protocol (MCP) that utilizes Gemini's extensive 2M token context window for advanced context management and caching. It supports session-based conversations, semantic search, and efficient API caching to optimize token usage and reduce costs. This server is compatible with MCP tools like Cursor, Claude Desktop, and VS Code, making it ideal for AI-enhanced development environments.
This package offers two MCP servers for ACI.dev: one for direct access to specific app functions and another for dynamically discovering and executing any available functions. It simplifies integration with tools like Claude Desktop and Cursor, enabling efficient function management without overloading the LLM's context window.
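For reference, dynamic discovery on the client side boils down to listTools followed by callTool, as in this sketch with the official TypeScript SDK (the launch command and tool selection are placeholders):

```typescript
// Sketch of dynamic tool discovery with @modelcontextprotocol/sdk. The
// launch command and tool arguments are placeholders; consult the package's
// README for the real ones.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const transport = new StdioClientTransport({
  command: "npx",
  args: ["some-mcp-server"],   // placeholder launch command
});
const client = new Client({ name: "demo-client", version: "0.1.0" }, { capabilities: {} });

await client.connect(transport);

// Discover whatever tools the server currently exposes...
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

// ...then execute one by name with JSON arguments matching its input schema.
const result = await client.callTool({
  name: tools[0].name,
  arguments: {},
});
console.log(result);
```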
VRChat MCP OSC provides a bridge between AI assistants and VRChat using the Model Context Protocol (MCP) and Open Sound Control (OSC). It allows AI assistants like Claude to control avatar parameters, send messages, and respond to VR events in VRChat. The project offers seamless integration with VRChat, enabling advanced AI-driven interactions in virtual reality environments.
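At the transport level, the bridge comes down to sending OSC messages to VRChat's input port (UDP 9000 by default). A tiny sketch assuming the node-osc library; the avatar parameter name is a placeholder:

```typescript
// Sketch of the OSC messages such a bridge relays to VRChat. VRChat listens
// for OSC input on UDP port 9000 by default; the parameter name below is a
// placeholder for one defined on your avatar.
import { Client } from "node-osc";

const osc = new Client("127.0.0.1", 9000);

// Set a (hypothetical) float avatar parameter, then close the socket.
osc.send("/avatar/parameters/Wave", 1.0, () => osc.close());
```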
This project is an MCP (Model Context Protocol) server implemented in TypeScript, designed to analyze GitHub Pull Requests. It provides a structured approach to handling PR data, enabling efficient analysis and integration with GitHub workflows. The server is built using Node.js and npm, with a clear project structure and scripts for building and running the server.
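A sketch of how a server like this registers an analysis tool with the official TypeScript SDK; the tool name and parameter schema here are illustrative, not this repository's actual interface:

```typescript
// Hypothetical tool registration for a PR-analysis MCP server.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "pr-analyzer", version: "0.1.0" });

server.tool(
  "analyze_pull_request",   // hypothetical tool name
  { owner: z.string(), repo: z.string(), number: z.number() },
  async ({ owner, repo, number }) => {
    // Fetch PR metadata from the GitHub REST API.
    const res = await fetch(
      `https://api.github.com/repos/${owner}/${repo}/pulls/${number}`,
      { headers: { Authorization: `Bearer ${process.env.GITHUB_TOKEN}` } },
    );
    const pr = await res.json();
    return {
      content: [{
        type: "text",
        text: `"${pr.title}": +${pr.additions}/-${pr.deletions} across ${pr.changed_files} files`,
      }],
    };
  },
);

await server.connect(new StdioServerTransport());
```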
The AI Meta MCP Server is designed to allow AI models to extend their capabilities by defining and executing custom tools at runtime. It supports multiple runtime environments including JavaScript, Python, and Shell, and ensures security through sandboxed execution and human-in-the-loop approval. The server also features tool persistence, a flexible tool registry, and audit logging for all operations.
This project provides a FastAPI-based asynchronous API server designed to communicate with Model Context Protocol (MCP) servers. It enables users to list available MCP servers, list tools provided by these servers, and invoke specific tools. The server is built with Python and includes features like Docker support and interactive API documentation via Swagger UI and ReDoc.
The Legion Database MCP Server facilitates seamless database interactions by integrating the Legion Query Runner with the Model Context Protocol (MCP). It supports various databases, exposes database operations as MCP resources, tools, and prompts, and offers flexible deployment options. This server is ideal for AI applications requiring context-aware database access and query execution.
The BrasilAPI MCP Server is a Model Context Protocol (MCP) implementation that enables seamless querying of BrasilAPI's extensive datasets, including postal codes, area codes, banks, holidays, and taxes. It enhances AI applications by providing a unified interface to access and utilize this data, supporting integration with various clients and LLMs. The server is built with TypeScript and offers tools for development, debugging, and deployment via Docker.
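The server's tools wrap BrasilAPI's public REST endpoints; a CEP (postal code) lookup, for instance, ultimately amounts to a request like this:

```typescript
// For reference: the kind of lookup the server's CEP tool performs under
// the hood, against BrasilAPI's public endpoint.
async function lookupCep(cep: string) {
  const res = await fetch(`https://brasilapi.com.br/api/cep/v1/${cep}`);
  if (!res.ok) throw new Error(`CEP ${cep} not found (${res.status})`);
  return res.json(); // { cep, state, city, neighborhood, street, ... }
}

lookupCep("01001000").then(console.log);
```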
This repository provides tools for automating Figma design creation and manipulation. It includes a Figma plugin for generating website components, Python scripts for direct file manipulation, and integration with MCP for enhanced functionality. Features include automated navigation bar creation, component templating, and style management.
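As a rough illustration of what automated navigation-bar creation looks like in the Figma Plugin API (not this repo's actual plugin code; the font and labels are assumptions):

```typescript
// Generic Figma Plugin API sketch: build a horizontal auto-layout frame and
// fill it with text items. Runs inside the plugin sandbox.
async function createNavBar(labels: string[]): Promise<FrameNode> {
  const nav = figma.createFrame();
  nav.name = "NavBar";
  nav.layoutMode = "HORIZONTAL";   // auto layout, items laid out in a row
  nav.itemSpacing = 32;
  nav.paddingLeft = nav.paddingRight = 24;
  nav.paddingTop = nav.paddingBottom = 16;

  await figma.loadFontAsync({ family: "Inter", style: "Regular" });
  for (const label of labels) {
    const item = figma.createText();
    item.characters = label;
    nav.appendChild(item);
  }

  figma.currentPage.appendChild(nav);
  return nav;
}

createNavBar(["Home", "Products", "About", "Contact"]).then(() => figma.closePlugin());
```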
The Enhanced PostgreSQL MCP Server is a Model Context Protocol server that extends the capabilities of the original PostgreSQL MCP server by Anthropic. It provides both read and write access to PostgreSQL databases, allowing LLMs to inspect database schemas, execute queries, modify data, and manage database schema objects. This enhanced version includes features like data modification, schema creation, and transaction handling, making it a powerful tool for integrating LLMs with PostgreSQL databases.
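A sketch of the transaction-wrapped write path such a server implements with node-postgres; the connection details and SQL are assumptions, and the real server exposes this behavior through MCP tools rather than a plain function:

```typescript
// Sketch of transaction handling around an LLM-requested write, using the
// node-postgres (`pg`) client. Connection string and SQL are placeholders.
import pg from "pg";

const pool = new pg.Pool({ connectionString: process.env.DATABASE_URL });

async function runWrite(sql: string, params: unknown[] = []) {
  const client = await pool.connect();
  try {
    await client.query("BEGIN");
    const result = await client.query(sql, params);
    await client.query("COMMIT");
    return { rowCount: result.rowCount, rows: result.rows };
  } catch (err) {
    await client.query("ROLLBACK");  // undo the change on any failure
    throw err;
  } finally {
    client.release();
  }
}

// Example: an LLM-requested data modification.
await runWrite("UPDATE users SET active = $1 WHERE id = $2", [true, 42]);
```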
The Workflows MCP Server enables the creation of reusable and customizable AI workflows by combining prompts and MCP servers. It allows users to define strategies for using multiple tools in sequential or situational modes, making it easier to manage complex tasks like debugging, incident resolution, and code analysis. The server supports YAML configurations for workflows, enabling version control and team collaboration.
This MCP server enables seamless integration with Ableton Live, allowing users to programmatically manage MIDI and audio tracks. It supports features like creating MIDI tracks, adding devices, and composing MIDI notes. The server is built using JavaScript and TypeScript, and it includes tools for debugging and testing. It currently supports Ableton 11 and macOS, with plans to expand to other versions and operating systems.
The Bluesky MCP Server integrates with Bluesky's ATProtocol to enable natural language interactions with Bluesky features. It allows users to fetch posts, analyze feeds, search for content, and even create posts using an LLM-based application. This server can be added to tools like Claude Desktop, turning it into a natural language Bluesky client.
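A sketch of the ATProtocol calls such a server wraps, using the official @atproto/api agent; the credential handling shown here is an assumption:

```typescript
// Sketch of the underlying ATProtocol calls: log in, read the timeline,
// and create a post. Credentials come from env vars here by assumption.
import { BskyAgent } from "@atproto/api";

const agent = new BskyAgent({ service: "https://bsky.social" });
await agent.login({
  identifier: process.env.BSKY_HANDLE!,       // e.g. "alice.bsky.social"
  password: process.env.BSKY_APP_PASSWORD!,   // an app password, not the main one
});

// Fetch recent posts from the home timeline.
const timeline = await agent.getTimeline({ limit: 10 });
for (const item of timeline.data.feed) {
  console.log(item.post.author.handle);
}

// Create a post from text produced by the LLM.
await agent.post({ text: "Posted via an MCP-driven Bluesky client." });
```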